Dec 02 19:57:43 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 02 19:57:43 crc restorecon[4687]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 19:57:43 crc restorecon[4687]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 02 19:57:43 crc restorecon[4687]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc 
restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc 
restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 
19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 19:57:43 crc restorecon[4687]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc 
restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 19:57:43 crc restorecon[4687]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 19:57:43 crc restorecon[4687]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 19:57:43 crc restorecon[4687]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 19:57:43 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 19:57:44 crc 
restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 
crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 
19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 19:57:44 crc 
restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 
crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc 
restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc 
restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 19:57:44 crc restorecon[4687]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 19:57:44 crc restorecon[4687]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 02 19:57:44 crc kubenswrapper[4807]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 19:57:44 crc kubenswrapper[4807]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 02 19:57:44 crc kubenswrapper[4807]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 19:57:44 crc kubenswrapper[4807]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 02 19:57:44 crc kubenswrapper[4807]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 02 19:57:44 crc kubenswrapper[4807]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.815245 4807 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818073 4807 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818097 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818101 4807 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818105 4807 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818109 4807 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818114 4807 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818118 4807 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818123 4807 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818127 4807 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818132 4807 feature_gate.go:330] unrecognized 
feature gate: ManagedBootImages
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818137 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818141 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818153 4807 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818159 4807 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818164 4807 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818168 4807 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818172 4807 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818175 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818180 4807 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818184 4807 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818187 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818191 4807 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818196 4807 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818202 4807 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818206 4807 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818209 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818214 4807 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818217 4807 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818221 4807 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818224 4807 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818228 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818232 4807 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818235 4807 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818239 4807 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818242 4807 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818246 4807 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818249 4807 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818252 4807 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818256 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818259 4807 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818263 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818267 4807 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818272 4807 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818276 4807 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818280 4807 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818284 4807 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818287 4807 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818290 4807 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.818396 4807 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819103 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819118 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819124 4807 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819129 4807 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819134 4807 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819139 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819144 4807 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819149 4807 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819153 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819158 4807 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819162 4807 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819167 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819171 4807 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819183 4807 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819187 4807 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819192 4807 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819197 4807 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819205 4807 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819211 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819217 4807 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819222 4807 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.819227 4807 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819583 4807 flags.go:64] FLAG: --address="0.0.0.0"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819601 4807 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819612 4807 flags.go:64] FLAG: --anonymous-auth="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819619 4807 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819626 4807 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819631 4807 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819644 4807 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819650 4807 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819656 4807 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819661 4807 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819666 4807 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819671 4807 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819677 4807 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819684 4807 flags.go:64] FLAG: --cgroup-root=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819692 4807 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819697 4807 flags.go:64] FLAG: --client-ca-file=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819701 4807 flags.go:64] FLAG: --cloud-config=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819706 4807 flags.go:64] FLAG: --cloud-provider=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819711 4807 flags.go:64] FLAG: --cluster-dns="[]"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819737 4807 flags.go:64] FLAG: --cluster-domain=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819744 4807 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819749 4807 flags.go:64] FLAG: --config-dir=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819759 4807 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819766 4807 flags.go:64] FLAG: --container-log-max-files="5"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819773 4807 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819778 4807 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819784 4807 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819789 4807 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819795 4807 flags.go:64] FLAG: --contention-profiling="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819801 4807 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819806 4807 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819817 4807 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819822 4807 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819829 4807 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819834 4807 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819840 4807 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819845 4807 flags.go:64] FLAG: --enable-load-reader="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819851 4807 flags.go:64] FLAG: --enable-server="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819857 4807 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819874 4807 flags.go:64] FLAG: --event-burst="100"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819880 4807 flags.go:64] FLAG: --event-qps="50"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819887 4807 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819893 4807 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819899 4807 flags.go:64] FLAG: --eviction-hard=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819908 4807 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819916 4807 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.819923 4807 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820008 4807 flags.go:64] FLAG: --eviction-soft=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820024 4807 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820032 4807 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820040 4807 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820048 4807 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820055 4807 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820061 4807 flags.go:64] FLAG: --fail-swap-on="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820068 4807 flags.go:64] FLAG: --feature-gates=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820083 4807 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820090 4807 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820097 4807 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820106 4807 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820113 4807 flags.go:64] FLAG: --healthz-port="10248"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820119 4807 flags.go:64] FLAG: --help="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820152 4807 flags.go:64] FLAG: --hostname-override=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820158 4807 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820163 4807 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820440 4807 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820448 4807 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820453 4807 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820459 4807 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820464 4807 flags.go:64] FLAG: --image-service-endpoint=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820468 4807 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820473 4807 flags.go:64] FLAG: --kube-api-burst="100"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820487 4807 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820493 4807 flags.go:64] FLAG: --kube-api-qps="50"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820498 4807 flags.go:64] FLAG: --kube-reserved=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820503 4807 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820508 4807 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820512 4807 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820517 4807 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820523 4807 flags.go:64] FLAG: --lock-file=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820528 4807 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820535 4807 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820540 4807 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820549 4807 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820555 4807 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820560 4807 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820565 4807 flags.go:64] FLAG: --logging-format="text"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820571 4807 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820577 4807 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820582 4807 flags.go:64] FLAG: --manifest-url=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820587 4807 flags.go:64] FLAG: --manifest-url-header=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820597 4807 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820601 4807 flags.go:64] FLAG: --max-open-files="1000000"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820609 4807 flags.go:64] FLAG: --max-pods="110"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820614 4807 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820618 4807 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820623 4807 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820628 4807 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820632 4807 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820637 4807 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820642 4807 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820676 4807 flags.go:64] FLAG: --node-status-max-images="50"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820681 4807 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820685 4807 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820690 4807 flags.go:64] FLAG: --pod-cidr=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820694 4807 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820732 4807 flags.go:64] FLAG: --pod-manifest-path=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820737 4807 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820742 4807 flags.go:64] FLAG: --pods-per-core="0"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820746 4807 flags.go:64] FLAG: --port="10250"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820751 4807 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820755 4807 flags.go:64] FLAG: --provider-id=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820759 4807 flags.go:64] FLAG: --qos-reserved=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820766 4807 flags.go:64] FLAG: --read-only-port="10255"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820771 4807 flags.go:64] FLAG: --register-node="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820775 4807 flags.go:64] FLAG: --register-schedulable="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820779 4807 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820789 4807 flags.go:64] FLAG: --registry-burst="10"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820793 4807 flags.go:64] FLAG: --registry-qps="5"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820797 4807 flags.go:64] FLAG: --reserved-cpus=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820802 4807 flags.go:64] FLAG: --reserved-memory=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820807 4807 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820812 4807 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820816 4807 flags.go:64] FLAG: --rotate-certificates="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820821 4807 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820825 4807 flags.go:64] FLAG: --runonce="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820831 4807 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820835 4807 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820841 4807 flags.go:64] FLAG: --seccomp-default="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820845 4807 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820850 4807 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820854 4807 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820859 4807 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820864 4807 flags.go:64] FLAG: --storage-driver-password="root"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820868 4807 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820873 4807 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820878 4807 flags.go:64] FLAG: --storage-driver-user="root"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820881 4807 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820886 4807 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820890 4807 flags.go:64] FLAG: --system-cgroups=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820894 4807 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820903 4807 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820907 4807 flags.go:64] FLAG: --tls-cert-file=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820912 4807 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820922 4807 flags.go:64] FLAG: --tls-min-version=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820927 4807 flags.go:64] FLAG: --tls-private-key-file=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820931 4807 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820935 4807 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820940 4807 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820944 4807 flags.go:64] FLAG: --v="2"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820952 4807 flags.go:64] FLAG: --version="false"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820958 4807 flags.go:64] FLAG: --vmodule=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820964 4807 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.820969 4807 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821197 4807 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821203 4807 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821207 4807 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821211 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821215 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821218 4807 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821222 4807 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821226 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821230 4807 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821234 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821238 4807 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821241 4807 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821245 4807 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821248 4807 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821259 4807 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821263 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821266 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821270 4807 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821274 4807 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821277 4807 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821284 4807 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821288 4807 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821291 4807 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821296 4807 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821299 4807 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821303 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821308 4807 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821313 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821317 4807 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821321 4807 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821326 4807 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821330 4807 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821335 4807 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821340 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821345 4807 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821350 4807 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821354 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821358 4807 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821363 4807 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821367 4807 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821372 4807 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821376 4807 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821380 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821385 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821389 4807 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821393 4807 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821397 4807 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821401 4807 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821405 4807 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821409 4807 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821420 4807 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821423 4807 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821429 4807 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821433 4807 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821437 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821441 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821446 4807 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821450 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821453 4807 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821457 4807 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821461 4807 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821465 4807 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821468 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821472 4807 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821476 4807 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821480 4807 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821483 4807 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821488 4807 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821493 4807 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821497 4807 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.821502 4807 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.821519 4807 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.830550 4807 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.830604 4807 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830681 4807 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830691 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830697 4807 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830703 4807 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830707 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830723 4807 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830727 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830733 4807 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830738 4807 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830742 4807 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830746 4807 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830750 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830754 4807 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830760 4807 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830766 4807 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830771 4807 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830777 4807 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830783 4807 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830789 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830794 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830800 4807 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830805 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830810 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830814 4807 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830818 4807 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830822 4807 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830826 4807 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830830 4807 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830834 4807 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830838 4807 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830842 4807 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830846 4807 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830850 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830853 4807 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830859 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830863 4807 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830867 4807 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830871 4807 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830875 4807 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830907 4807 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830912 4807 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830915 4807 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830920 4807 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830925 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830929 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830934 4807 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830938 4807 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830943 4807 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830947 4807 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830951 4807 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830955 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830959 4807 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830963 4807 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830967 4807 
feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830971 4807 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830975 4807 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830979 4807 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830983 4807 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830987 4807 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830991 4807 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830995 4807 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.830999 4807 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831003 4807 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831007 4807 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831012 4807 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831018 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831023 4807 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831028 4807 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831033 4807 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831037 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831042 4807 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.831050 4807 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831215 4807 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831226 4807 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831232 4807 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831239 4807 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831244 4807 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831250 4807 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 02 19:57:44 crc 
kubenswrapper[4807]: W1202 19:57:44.831254 4807 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831258 4807 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831264 4807 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831270 4807 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831277 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831282 4807 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831287 4807 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831291 4807 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831296 4807 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831303 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831308 4807 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831312 4807 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831316 4807 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831321 4807 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 02 
19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831325 4807 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831329 4807 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831333 4807 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831337 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831341 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831345 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831349 4807 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831353 4807 feature_gate.go:330] unrecognized feature gate: Example Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831357 4807 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831361 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831365 4807 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831369 4807 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831373 4807 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831377 4807 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831381 4807 feature_gate.go:330] unrecognized feature gate: 
GCPLabelsTags Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831385 4807 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831389 4807 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831393 4807 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831397 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831401 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831406 4807 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831411 4807 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831417 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831421 4807 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831425 4807 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831429 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831434 4807 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831438 4807 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831443 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831447 4807 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831451 4807 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831455 4807 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831459 4807 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831464 4807 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831468 4807 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831472 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831477 4807 
feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831482 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831486 4807 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831490 4807 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831494 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831498 4807 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831502 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831506 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831510 4807 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831514 4807 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831518 4807 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831522 4807 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831527 4807 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831530 4807 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.831536 4807 feature_gate.go:353] Setting GA feature gate 
CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.831544 4807 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.832036 4807 server.go:940] "Client rotation is on, will bootstrap in background" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.835563 4807 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.835680 4807 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.836494 4807 server.go:997] "Starting client certificate rotation" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.836529 4807 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.836978 4807 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-11 07:07:37.292936801 +0000 UTC Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.837098 4807 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 203h9m52.455841269s for next certificate rotation Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.843850 4807 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.846117 4807 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.859875 4807 log.go:25] "Validated CRI v1 runtime API" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.884576 4807 log.go:25] "Validated CRI v1 image API" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.887121 4807 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.890060 4807 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-02-19-53-00-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.890136 4807 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.911623 4807 manager.go:217] Machine: {Timestamp:2025-12-02 19:57:44.910272049 +0000 UTC m=+0.211179564 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4d376703-634d-4ff9-8cdc-7b05f903cec2 BootID:b38988ab-e6bd-44f1-a049-4d7d2ffee59a Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:47:37:24 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:47:37:24 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:db:00:b3 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b6:b3:6e Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:4a:31:99 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:2e:c2:f9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:06:b7:f4:9e:0a:07 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:06:c2:f8:cb:87:29 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.911888 4807 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.912121 4807 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.912434 4807 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.912644 4807 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.912696 4807 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.913118 4807 topology_manager.go:138] "Creating topology manager with none policy"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.913132 4807 container_manager_linux.go:303] "Creating device plugin manager"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.913331 4807 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.913387 4807 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.913759 4807 state_mem.go:36] "Initialized new in-memory state store"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.913872 4807 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.914707 4807 kubelet.go:418] "Attempting to sync node with API server"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.914747 4807 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.914773 4807 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.914788 4807 kubelet.go:324] "Adding apiserver pod source"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.914804 4807 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.916819 4807 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.917273 4807 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.917647 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Dec 02 19:57:44 crc kubenswrapper[4807]: E1202 19:57:44.917792 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918167 4807 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.918144 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Dec 02 19:57:44 crc kubenswrapper[4807]: E1202 19:57:44.918276 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918748 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918776 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918785 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918793 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918808 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918818 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918827 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918841 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918851 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918860 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918873 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.918883 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.919151 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.919703 4807 server.go:1280] "Started kubelet"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.919922 4807 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.920107 4807 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.920251 4807 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.920776 4807 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 02 19:57:44 crc systemd[1]: Started Kubernetes Kubelet.
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.922320 4807 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.922373 4807 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.922942 4807 server.go:460] "Adding debug handlers to kubelet server"
Dec 02 19:57:44 crc kubenswrapper[4807]: E1202 19:57:44.924104 4807 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.922964 4807 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.922941 4807 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.924371 4807 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.922458 4807 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 02:07:51.701035585 +0000 UTC
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.924544 4807 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 294h10m6.776508806s for next certificate rotation
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.926270 4807 factory.go:55] Registering systemd factory
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.926326 4807 factory.go:221] Registration of the systemd container factory successfully
Dec 02 19:57:44 crc kubenswrapper[4807]: E1202 19:57:44.926574 4807 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.36:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d7e4541ba0a31 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 19:57:44.919657009 +0000 UTC m=+0.220564504,LastTimestamp:2025-12-02 19:57:44.919657009 +0000 UTC m=+0.220564504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.927384 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Dec 02 19:57:44 crc kubenswrapper[4807]: E1202 19:57:44.927476 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError"
Dec 02 19:57:44 crc kubenswrapper[4807]: E1202 19:57:44.928098 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="200ms"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.930201 4807 factory.go:153] Registering CRI-O factory
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.930237 4807 factory.go:221] Registration of the crio container factory successfully
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.930336 4807 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.930371 4807 factory.go:103] Registering Raw factory
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.931275 4807 manager.go:1196] Started watching for new ooms in manager
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.932409 4807 manager.go:319] Starting recovery of all containers
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937541 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937634 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937651 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937666 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937685 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937702 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937752 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937769 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937841 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937854 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937904 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937921 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937936 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937957 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937973 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.937990 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938002 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938015 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938067 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938084 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938099 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938114 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938127 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938143 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938163 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938177 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938203 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938248 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938264 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938280 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938293 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938306 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938355 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938371 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938387 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938402 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938417 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938431 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938452 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938467 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938482 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.938500 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940080 4807 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940170 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940195 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940211 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940231 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940245 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940255 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940266 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940276 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940286 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940302 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940322 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940335 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940350 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940366 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940380 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940392 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940403 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940415 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940431 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940440 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940453 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940463 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940477 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940486 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940497 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940507 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940516 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940528 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940541 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940559 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940571 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940584 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940594 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940605 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940617 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940627 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940690 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940702 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940729 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key"
seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940741 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940753 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940766 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940781 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940794 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940808 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940823 
4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940834 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940845 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940856 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940868 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940880 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940892 4807 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940903 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940913 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940925 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940937 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940947 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940959 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940969 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940978 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940989 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.940999 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941047 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941060 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941073 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941090 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941103 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941114 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941127 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941137 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941148 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941160 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941173 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941183 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941192 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941203 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 
19:57:44.941213 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941225 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941235 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941245 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941257 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941268 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941279 4807 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941290 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941300 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941310 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941320 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941333 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941343 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941355 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941366 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941378 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941389 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941403 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941413 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941422 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941434 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941444 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941454 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941465 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941474 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941485 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941495 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941506 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941516 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941527 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941537 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" 
seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941547 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941557 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941598 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941608 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941617 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941627 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941636 4807 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941646 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941656 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941665 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941673 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941683 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941691 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941701 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941710 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941734 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941743 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941752 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941764 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941774 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941784 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941799 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941809 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941818 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941829 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" 
seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941840 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941849 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941858 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941869 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941883 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941893 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 02 
19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941904 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941915 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941924 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941935 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941944 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941954 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941964 4807 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941973 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941983 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.941992 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942004 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942013 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942023 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942033 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942042 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942051 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942061 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942070 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942080 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942090 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942098 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942108 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942122 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942131 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942139 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942149 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942157 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942166 4807 reconstruct.go:97] "Volume reconstruction finished" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.942174 4807 reconciler.go:26] "Reconciler: start to sync state" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.956116 4807 manager.go:324] Recovery completed Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.969471 4807 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.970426 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.971016 4807 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.971102 4807 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.971133 4807 kubelet.go:2335] "Starting kubelet main sync loop" Dec 02 19:57:44 crc kubenswrapper[4807]: E1202 19:57:44.971182 4807 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 02 19:57:44 crc kubenswrapper[4807]: W1202 19:57:44.972198 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused Dec 02 19:57:44 crc kubenswrapper[4807]: E1202 19:57:44.972354 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.974292 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.974335 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.974350 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.975865 4807 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.975887 4807 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" 
Dec 02 19:57:44 crc kubenswrapper[4807]: I1202 19:57:44.975913 4807 state_mem.go:36] "Initialized new in-memory state store" Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.025172 4807 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.071412 4807 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.080999 4807 policy_none.go:49] "None policy: Start" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.082298 4807 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.082369 4807 state_mem.go:35] "Initializing new in-memory state store" Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.125994 4807 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.129595 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="400ms" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.136895 4807 manager.go:334] "Starting Device Plugin manager" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.137014 4807 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.137073 4807 server.go:79] "Starting device plugin registration server" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.137545 4807 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.137616 4807 
container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.137795 4807 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.137899 4807 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.137910 4807 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.143670 4807 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.238257 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.239923 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.240001 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.240025 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.240065 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.240824 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.36:6443: connect: connection refused" node="crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.272129 4807 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.272463 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.276323 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.276377 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.276390 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.276591 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.276999 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.277202 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.277622 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.277686 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.277699 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.277903 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.278129 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.278208 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.278910 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.278938 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.278948 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.279029 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.279144 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.279183 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.279912 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.279937 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.279947 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280183 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280288 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280240 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280359 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280619 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280342 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280652 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280669 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280681 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.280535 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.281779 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.281885 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.281921 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.282265 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.282294 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.282305 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.282454 
4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.282480 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.283542 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.283620 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.283631 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.347656 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.347707 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.347754 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 19:57:45 crc 
kubenswrapper[4807]: I1202 19:57:45.347772 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.347792 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.347808 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.347937 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.347982 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.348019 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.348046 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.348115 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.348146 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.348171 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.348193 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.348215 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.441553 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.443110 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.443164 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.443185 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.443216 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.443901 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.36:6443: connect: connection refused" node="crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449153 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449197 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449226 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449245 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449266 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449283 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449299 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449337 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449360 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449378 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449382 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449406 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449397 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449458 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449507 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449542 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449563 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449566 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449578 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449588 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449593 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449627 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449610 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449645 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449629 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449668 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449699 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449755 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449769 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.449938 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.531236 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="800ms"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.621427 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.627791 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: W1202 19:57:45.650077 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-28b4f9b3aba2d9571cf076da50a5769b3219df485049de5f5e35b405a8b39335 WatchSource:0}: Error finding container 28b4f9b3aba2d9571cf076da50a5769b3219df485049de5f5e35b405a8b39335: Status 404 returned error can't find the container with id 28b4f9b3aba2d9571cf076da50a5769b3219df485049de5f5e35b405a8b39335
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.654453 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.661578 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.666444 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: W1202 19:57:45.667267 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-75a986faf5a7c5f0e571c533643ffdda95576b7f3f022b9ea3259e24b597739e WatchSource:0}: Error finding container 75a986faf5a7c5f0e571c533643ffdda95576b7f3f022b9ea3259e24b597739e: Status 404 returned error can't find the container with id 75a986faf5a7c5f0e571c533643ffdda95576b7f3f022b9ea3259e24b597739e
Dec 02 19:57:45 crc kubenswrapper[4807]: W1202 19:57:45.787354 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.787977 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError"
Dec 02 19:57:45 crc kubenswrapper[4807]: W1202 19:57:45.824304 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.824433 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.844604 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.846471 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.846536 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.846551 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.846582 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.847189 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.36:6443: connect: connection refused" node="crc"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.922073 4807 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Dec 02 19:57:45 crc kubenswrapper[4807]: W1202 19:57:45.934170 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Dec 02 19:57:45 crc kubenswrapper[4807]: E1202 19:57:45.934289 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError"
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.978996 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4f82c0663c25c9e46e90a1004d3f90208c413c969ebacfdd7dfed42dbbb33c8f"}
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.980209 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e3fb101497195c2cb01ea0749bf4aa0b51229fe279ff4dfdbfef0de8c793c5c7"}
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.981406 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"75a986faf5a7c5f0e571c533643ffdda95576b7f3f022b9ea3259e24b597739e"}
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.982530 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e2fb80f4473e9089e5e7a67e7846ce36e27677ae3e68335175935c141ecbddfb"}
Dec 02 19:57:45 crc kubenswrapper[4807]: I1202 19:57:45.983648 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28b4f9b3aba2d9571cf076da50a5769b3219df485049de5f5e35b405a8b39335"}
Dec 02 19:57:46 crc kubenswrapper[4807]: W1202 19:57:46.132839 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Dec 02 19:57:46 crc kubenswrapper[4807]: E1202 19:57:46.132951 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError"
Dec 02 19:57:46 crc kubenswrapper[4807]: E1202 19:57:46.332121 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="1.6s"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.648253 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.650680 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.650756 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.650773 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.650812 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 02 19:57:46 crc kubenswrapper[4807]: E1202 19:57:46.651544 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.36:6443: connect: connection refused" node="crc"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.921493 4807 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.988188 4807 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb" exitCode=0
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.988279 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb"}
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.988365 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.989528 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.989572 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.989586 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.992623 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8b067e9a8d2652bfdd33e8e62427cfc1d871c6d170422a45d5676fbc4cd415f6"}
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.992586 4807 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8b067e9a8d2652bfdd33e8e62427cfc1d871c6d170422a45d5676fbc4cd415f6" exitCode=0
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.992899 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.996824 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.996918 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:46 crc kubenswrapper[4807]: I1202 19:57:46.996939 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.000119 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2"}
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.000190 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.000212 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937"}
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.000240 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661"}
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.000264 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c"}
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.002134 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.002211 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.002239 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.003503 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982"}
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.003514 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982" exitCode=0
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.003781 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.005515 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.005560 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.005574 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.006760 4807 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d0ec3b40ab1ad479c8e5b3a2d838a2451ab3d98217f3c3470312e801c6cdc31d" exitCode=0
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.006816 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d0ec3b40ab1ad479c8e5b3a2d838a2451ab3d98217f3c3470312e801c6cdc31d"}
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.006914 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.007826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.007858 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.007874 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.009271 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.010025 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.010058 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.010072 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:47 crc kubenswrapper[4807]: I1202 19:57:47.342557 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 19:57:47 crc kubenswrapper[4807]: W1202 19:57:47.464327 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.36:6443: connect: connection refused
Dec 02 19:57:47 crc kubenswrapper[4807]: E1202 19:57:47.464413 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.36:6443: connect: connection refused" logger="UnhandledError"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.015614 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2e6993537ec825ca91163121a75d7129394b78f55140f7e4f96339a6609486f8"}
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.015745 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8b8def930722eb9a717d4c8945163ad1313a7912ac511b4a4c66b942a012afb0"}
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.015793 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"287123147f1562edac9044a1b110e49241e677f8df9ac9987c0865d0d202f9de"}
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.015922 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.017829 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.017874 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.017884 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.019964 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.019945 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"70220dddbe9bd2bfff2d1e664146673a1d2f33b9dc08f717241446e2baa60d79"}
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.026504 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.026556 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.026576 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.031706 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e"}
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.031835 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015"}
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.031868 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe"}
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.031894 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311"}
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.034203 4807 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a8a209b5cc545eb3cdd20907438fdf72f42aeea488ec59fbc784196fc4687244" exitCode=0
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.034266 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a8a209b5cc545eb3cdd20907438fdf72f42aeea488ec59fbc784196fc4687244"}
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.034364 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.034369 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.035602 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.035650 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.035667 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.036571 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.036610 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.036626 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.251707 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.254022 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.254074 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.254085 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:48 crc kubenswrapper[4807]: I1202 19:57:48.254118 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.043663 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a"}
Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.043813 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.045978 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.046035 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.046056 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.047320 4807 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1c0e4057d28a7d479901d20ad133d05a7e13ca68c235ccdd856616d8798b1bf1" exitCode=0
Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.047432 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.047493 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.047511 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.047577 4807 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1c0e4057d28a7d479901d20ad133d05a7e13ca68c235ccdd856616d8798b1bf1"} Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.047637 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.047525 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.049374 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.049427 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.049454 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.049468 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.049476 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.049528 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.049561 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.049535 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 
19:57:49.049623 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.049577 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.049579 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:49 crc kubenswrapper[4807]: I1202 19:57:49.049849 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.061860 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"696f8be430216eb181e9e751637a3aa6926e078ceb5f58ab886972805a940510"} Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.062666 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4e7bf180f35b855fcc6138fd0249311026c25bfff0d241a62ef25554569fcb08"} Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.062749 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cc641cc8e6ba84640859b21e1922ee0bbdf945f4b656730064e1b04e0e0eb0a3"} Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.062766 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a80d2c68cf4707fb23efafaa78f8ae794891ba428ac8bb48b6f7ed0182f74bea"} Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.061917 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 19:57:50 crc 
kubenswrapper[4807]: I1202 19:57:50.062040 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.062861 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.064230 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.064302 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.064323 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.064312 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.064373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.064390 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.343569 4807 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.343938 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 19:57:50 crc kubenswrapper[4807]: I1202 19:57:50.565383 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 19:57:51 crc kubenswrapper[4807]: I1202 19:57:51.071673 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 19:57:51 crc kubenswrapper[4807]: I1202 19:57:51.071678 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f19ac09a73d601fdf51ce1c922b68ae27e509c707b92b5fb77b647ef9a4a95d8"} Dec 02 19:57:51 crc kubenswrapper[4807]: I1202 19:57:51.071744 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:51 crc kubenswrapper[4807]: I1202 19:57:51.071804 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:51 crc kubenswrapper[4807]: I1202 19:57:51.073223 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:51 crc kubenswrapper[4807]: I1202 19:57:51.073329 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:51 crc kubenswrapper[4807]: I1202 19:57:51.073461 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:51 crc kubenswrapper[4807]: I1202 19:57:51.073337 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:51 crc kubenswrapper[4807]: I1202 19:57:51.073599 4807 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:51 crc kubenswrapper[4807]: I1202 19:57:51.073621 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:52 crc kubenswrapper[4807]: I1202 19:57:52.076352 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:52 crc kubenswrapper[4807]: I1202 19:57:52.077805 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:52 crc kubenswrapper[4807]: I1202 19:57:52.077901 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:52 crc kubenswrapper[4807]: I1202 19:57:52.077923 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:53 crc kubenswrapper[4807]: I1202 19:57:53.553552 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 02 19:57:53 crc kubenswrapper[4807]: I1202 19:57:53.553951 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:53 crc kubenswrapper[4807]: I1202 19:57:53.556888 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:53 crc kubenswrapper[4807]: I1202 19:57:53.556940 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:53 crc kubenswrapper[4807]: I1202 19:57:53.556955 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:53 crc kubenswrapper[4807]: I1202 19:57:53.921550 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 19:57:53 crc kubenswrapper[4807]: I1202 19:57:53.921806 4807 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 19:57:53 crc kubenswrapper[4807]: I1202 19:57:53.921854 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:53 crc kubenswrapper[4807]: I1202 19:57:53.923476 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:53 crc kubenswrapper[4807]: I1202 19:57:53.923510 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:53 crc kubenswrapper[4807]: I1202 19:57:53.923519 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:55 crc kubenswrapper[4807]: E1202 19:57:55.143805 4807 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.548039 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.548655 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.552136 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.552198 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.552226 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.557064 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.606144 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.607063 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.612200 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.612335 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.612410 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:55 crc kubenswrapper[4807]: I1202 19:57:55.833289 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 19:57:56 crc kubenswrapper[4807]: I1202 19:57:56.029511 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 19:57:56 crc kubenswrapper[4807]: I1202 19:57:56.036347 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 19:57:56 crc kubenswrapper[4807]: I1202 19:57:56.094368 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:56 crc kubenswrapper[4807]: I1202 19:57:56.095491 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:56 crc kubenswrapper[4807]: I1202 19:57:56.095578 4807 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:56 crc kubenswrapper[4807]: I1202 19:57:56.095601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:57 crc kubenswrapper[4807]: I1202 19:57:57.098521 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:57:57 crc kubenswrapper[4807]: I1202 19:57:57.100086 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:57:57 crc kubenswrapper[4807]: I1202 19:57:57.100139 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:57:57 crc kubenswrapper[4807]: I1202 19:57:57.100155 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:57:57 crc kubenswrapper[4807]: I1202 19:57:57.922198 4807 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 02 19:57:57 crc kubenswrapper[4807]: E1202 19:57:57.933863 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 02 19:57:58 crc kubenswrapper[4807]: W1202 19:57:58.065969 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 02 19:57:58 crc kubenswrapper[4807]: I1202 19:57:58.066093 4807 trace.go:236] Trace[314242778]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 19:57:48.064) (total time: 10001ms): Dec 02 19:57:58 crc kubenswrapper[4807]: Trace[314242778]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (19:57:58.065) Dec 02 19:57:58 crc kubenswrapper[4807]: Trace[314242778]: [10.00160998s] [10.00160998s] END Dec 02 19:57:58 crc kubenswrapper[4807]: E1202 19:57:58.066122 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 02 19:57:58 crc kubenswrapper[4807]: W1202 19:57:58.179319 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 02 19:57:58 crc kubenswrapper[4807]: I1202 19:57:58.179440 4807 trace.go:236] Trace[2109339507]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 19:57:48.178) (total time: 10001ms): Dec 02 19:57:58 crc kubenswrapper[4807]: Trace[2109339507]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (19:57:58.179) Dec 02 19:57:58 crc kubenswrapper[4807]: Trace[2109339507]: [10.00133935s] [10.00133935s] END Dec 02 19:57:58 crc kubenswrapper[4807]: E1202 19:57:58.179475 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 02 19:57:58 crc kubenswrapper[4807]: E1202 19:57:58.255693 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 02 19:57:58 crc kubenswrapper[4807]: W1202 19:57:58.384935 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 02 19:57:58 crc kubenswrapper[4807]: I1202 19:57:58.385085 4807 trace.go:236] Trace[1875207731]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 19:57:48.382) (total time: 10002ms): Dec 02 19:57:58 crc kubenswrapper[4807]: Trace[1875207731]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (19:57:58.384) Dec 02 19:57:58 crc kubenswrapper[4807]: Trace[1875207731]: [10.00259637s] [10.00259637s] END Dec 02 19:57:58 crc kubenswrapper[4807]: E1202 19:57:58.385126 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 02 19:57:58 crc kubenswrapper[4807]: I1202 19:57:58.576502 4807 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 02 19:57:58 crc kubenswrapper[4807]: I1202 19:57:58.576584 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 02 19:58:00 crc kubenswrapper[4807]: I1202 19:58:00.344770 4807 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 02 19:58:00 crc kubenswrapper[4807]: I1202 19:58:00.344894 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 19:58:00 crc kubenswrapper[4807]: I1202 19:58:00.429409 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 02 19:58:00 crc kubenswrapper[4807]: I1202 19:58:00.429837 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:58:00 crc kubenswrapper[4807]: I1202 19:58:00.431871 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:00 crc kubenswrapper[4807]: I1202 19:58:00.431963 4807 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:00 crc kubenswrapper[4807]: I1202 19:58:00.431998 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:00 crc kubenswrapper[4807]: I1202 19:58:00.465994 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 02 19:58:01 crc kubenswrapper[4807]: I1202 19:58:01.113233 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:58:01 crc kubenswrapper[4807]: I1202 19:58:01.114308 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:01 crc kubenswrapper[4807]: I1202 19:58:01.114351 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:01 crc kubenswrapper[4807]: I1202 19:58:01.114364 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:01 crc kubenswrapper[4807]: I1202 19:58:01.134159 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 02 19:58:01 crc kubenswrapper[4807]: I1202 19:58:01.456914 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:58:01 crc kubenswrapper[4807]: I1202 19:58:01.458383 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:01 crc kubenswrapper[4807]: I1202 19:58:01.458426 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:01 crc kubenswrapper[4807]: I1202 19:58:01.458435 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:01 crc kubenswrapper[4807]: I1202 19:58:01.458462 4807 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 19:58:01 crc kubenswrapper[4807]: E1202 19:58:01.463283 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.115266 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.116353 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.116387 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.116402 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.288676 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.288942 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.290131 4807 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.290186 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" 
output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.290315 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.290405 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.290427 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.294709 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 19:58:02 crc kubenswrapper[4807]: I1202 19:58:02.376540 4807 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.118152 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.118597 4807 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.118662 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.119707 4807 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.119750 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.119765 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.577549 4807 trace.go:236] Trace[877390910]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 19:57:51.519) (total time: 12058ms): Dec 02 19:58:03 crc kubenswrapper[4807]: Trace[877390910]: ---"Objects listed" error: 12058ms (19:58:03.577) Dec 02 19:58:03 crc kubenswrapper[4807]: Trace[877390910]: [12.058458605s] [12.058458605s] END Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.577587 4807 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.578333 4807 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.619432 4807 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.704434 4807 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.927960 4807 apiserver.go:52] "Watching apiserver" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.930852 4807 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.931086 4807 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.931441 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.931617 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:03 crc kubenswrapper[4807]: E1202 19:58:03.931780 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.931861 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.931940 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:03 crc kubenswrapper[4807]: E1202 19:58:03.931982 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.932023 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:03 crc kubenswrapper[4807]: E1202 19:58:03.932094 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.932120 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.936454 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.936512 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.937006 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.937349 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.937374 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.937410 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.937550 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.938707 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.938759 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.968749 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.986644 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:03 crc kubenswrapper[4807]: I1202 19:58:03.998173 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.009584 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.018673 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.024998 4807 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.031952 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.044748 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.055543 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.082882 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.082950 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.082980 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083013 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083041 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083066 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083092 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083117 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083146 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083169 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083196 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083222 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083247 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083279 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083305 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083333 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083358 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083391 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083427 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083454 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083583 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083613 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083640 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083665 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083667 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083680 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083759 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083759 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083686 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083691 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083914 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083942 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083967 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083995 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084020 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084045 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084068 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084097 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084119 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084148 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084171 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084244 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084272 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084329 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084366 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084389 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084411 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084429 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084449 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084470 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084496 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084525 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084547 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084571 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084595 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084621 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084642 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084666 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084685 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084708 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084777 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084826 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085229 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085261 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085276 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085585 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085623 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085647 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085667 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085688 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085740 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.083965 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084060 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084206 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084251 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084256 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084266 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084277 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084321 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084346 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086359 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084458 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084488 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084539 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084562 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084588 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084599 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.084841 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085613 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085540 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085874 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085902 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085931 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086001 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086039 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086128 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086216 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086330 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086536 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086569 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086611 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086660 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086693 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.086685 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.087020 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.087047 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.087216 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.087239 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.087604 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.087242 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.087347 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.087467 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.087552 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.087781 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.088989 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.088996 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.089119 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.089343 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.089360 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.089369 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.089408 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.089630 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.089747 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.089755 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 19:58:04.589702965 +0000 UTC m=+19.890610460 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.089877 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.090161 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.090182 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.085861 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.090308 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.090328 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.090348 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.090366 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.090639 4807 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.091053 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.091064 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.091219 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.091601 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.091618 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.090383 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092184 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092204 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092219 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092235 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092342 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092467 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092495 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092513 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092548 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092566 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092598 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092615 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.092651 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.093430 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.093952 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.094040 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.094092 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.094132 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.094180 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.094228 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.094262 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.094122 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.096368 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.100671 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.095332 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: 
"node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.100744 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.095911 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.096344 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.096435 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.097740 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.098015 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.098077 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.098286 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.098448 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.098725 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.099123 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.099325 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.099450 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.099500 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.100654 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101180 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101340 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101362 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101460 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101507 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101539 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: 
I1202 19:58:04.101574 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101671 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101708 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101765 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101796 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101833 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101867 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101857 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101934 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.101997 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102001 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102033 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102064 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102090 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102159 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102197 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102227 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102253 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102284 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102314 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102342 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102369 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102399 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102423 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102455 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 
02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102416 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102492 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102554 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102587 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102583 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102629 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102667 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102693 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102746 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102775 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102754 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102804 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102836 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102882 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102941 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102962 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.102988 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103029 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103078 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103119 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103137 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103268 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103294 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103322 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: 
"43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103346 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103371 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103368 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103393 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103415 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103471 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103523 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103545 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103683 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103704 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103753 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103777 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103794 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103869 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103910 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103908 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: 
"ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103934 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103938 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.103957 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104026 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104136 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104171 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104200 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104218 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104268 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104245 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104356 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104373 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104386 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104624 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104454 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104850 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104889 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.104896 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.105042 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.105099 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.105081 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.105172 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.105246 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.105343 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.105381 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.105485 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.105701 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.106284 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.106286 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.106319 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.106809 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.106886 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.106981 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107021 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107082 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107121 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107157 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107215 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107252 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107283 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107322 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107356 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107389 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107421 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107457 
4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107912 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107958 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107997 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108038 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108079 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108109 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108145 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108742 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108779 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108832 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 19:58:04 crc 
kubenswrapper[4807]: I1202 19:58:04.108869 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108895 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108926 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109267 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109317 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109354 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109548 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109607 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109650 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109690 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109758 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109801 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109839 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109875 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109911 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109953 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109994 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.110168 4807 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.110706 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.110761 4807 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.110818 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.110838 4807 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.110990 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111004 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111018 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111033 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111065 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111748 4807 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111771 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111785 4807 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111828 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111841 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111948 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112112 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112126 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node 
\"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112158 4807 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112173 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112182 4807 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112194 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112205 4807 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112237 4807 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112248 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" 
DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112259 4807 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112273 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112282 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112312 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112333 4807 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112347 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112356 4807 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112387 4807 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112399 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112414 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112424 4807 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112434 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112465 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112482 4807 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112496 4807 reconciler_common.go:293] 
"Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112505 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112515 4807 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112547 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112559 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112590 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112599 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112632 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112642 4807 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112653 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112667 4807 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112701 4807 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112793 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112804 4807 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112872 4807 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath 
\"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112884 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.113060 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.113072 4807 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.113261 4807 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.113300 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114102 4807 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114142 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114232 4807 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114248 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114264 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114292 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114312 4807 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114329 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114344 4807 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114343 4807 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114363 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.115827 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.107905 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.116437 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.116470 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.116505 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108012 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108279 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.105919 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108339 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.108350 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109839 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109950 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.109975 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.110069 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.110105 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.110119 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111478 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111585 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111679 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.111908 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112884 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.112934 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.113111 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.113231 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.113236 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.113272 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.113902 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.113966 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114036 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.116796 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.114226 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.114324 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114275 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114410 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114568 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114904 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114930 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.114964 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.115204 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.115308 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.115617 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.116002 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.119596 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.120032 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.120047 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.120639 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.119425 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121207 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121513 4807 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.121601 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-02 19:58:04.62158007 +0000 UTC m=+19.922487555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.121676 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:04.621668332 +0000 UTC m=+19.922575827 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121691 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121835 4807 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121849 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121861 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121874 4807 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121884 4807 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121895 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121906 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121918 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121929 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121940 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121952 4807 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121962 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121972 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121982 4807 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.121992 4807 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122002 4807 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122011 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122024 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122035 4807 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122045 4807 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122054 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122065 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122074 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122084 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122094 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122104 4807 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122128 4807 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122138 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122149 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122159 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122169 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122179 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122191 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122214 4807 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122224 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122234 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122243 4807 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122252 4807 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122262 4807 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122273 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122644 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122934 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.122967 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.123029 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.123053 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.123267 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.123550 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.123573 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.123862 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.123890 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.124009 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.124426 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.127696 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.127983 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.128088 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.128388 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:04.628298896 +0000 UTC m=+19.929206601 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.129945 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.130079 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.131061 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.131619 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.131776 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.131992 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.132051 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.132378 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.132485 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.132320 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.132644 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.133029 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.133007 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.133540 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.133754 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.134473 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.135241 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a" exitCode=255
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.135311 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a"}
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.135273 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.136691 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.136940 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.137043 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.137125 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.137271 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:04.63724954 +0000 UTC m=+19.938157235 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.139158 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.140674 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.140943 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.141876 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.142103 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.143412 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.144338 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.145363 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.145485 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.145630 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.146509 4807 scope.go:117] "RemoveContainer" containerID="17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a"
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.147549 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.147761 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.148180 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.148248 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.157571 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted.
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.159105 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.162480 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.187139 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.221184 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.221496 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226788 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226849 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226894 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226907 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226918 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226927 4807 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226936 4807 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226946 4807 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226956 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226964 4807 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc 
kubenswrapper[4807]: I1202 19:58:04.226974 4807 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226984 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.226996 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227007 4807 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227018 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227030 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227041 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227049 
4807 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227059 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227068 4807 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227077 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227087 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227123 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227132 4807 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227143 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227151 4807 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227160 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227171 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227182 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227192 4807 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227202 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227213 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227245 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227255 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227265 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227275 4807 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227285 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227295 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227304 4807 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on 
node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227337 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227352 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227369 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227381 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227393 4807 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227408 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227430 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227413 4807 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227450 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227514 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227537 4807 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227693 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227707 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227746 4807 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227758 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227771 4807 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227783 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227795 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227807 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227818 4807 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227837 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" 
DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227850 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227863 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227874 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227884 4807 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227896 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227908 4807 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227920 4807 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227936 4807 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227947 4807 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227959 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227970 4807 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227984 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227997 4807 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228010 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228021 4807 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228033 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228044 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228058 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228069 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228081 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228094 4807 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228109 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228122 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228135 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.228148 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.227379 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.243362 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.247342 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.256619 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.256768 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.264461 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.329094 4807 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.631194 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.631322 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 19:58:05.631305025 +0000 UTC m=+20.932212520 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.631708 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.631785 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.631834 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.631849 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 
19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.631856 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.631879 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.631896 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.631920 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:05.631909331 +0000 UTC m=+20.932816826 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.631936 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-02 19:58:05.631928672 +0000 UTC m=+20.932836167 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.631949 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.631970 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:05.631962782 +0000 UTC m=+20.932870277 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.733232 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.733413 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.733446 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.733462 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:04 crc kubenswrapper[4807]: E1202 19:58:04.733531 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-02 19:58:05.733512401 +0000 UTC m=+21.034419896 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.975564 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.976384 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.977145 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.977802 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.978389 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.978951 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.979534 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.981538 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.982431 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.983555 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.984271 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.985498 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.986035 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.986972 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.987494 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.988232 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.991241 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.992191 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.992631 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.993247 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.994493 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.994985 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.996150 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.996615 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.997691 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.998134 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.999205 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 02 19:58:04 crc kubenswrapper[4807]: I1202 19:58:04.999876 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.000328 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.001340 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.001897 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.002795 4807 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.002901 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.005540 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.006529 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.006940 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.008785 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.009431 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.010805 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.011452 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.012556 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.013062 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.014109 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.015154 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.015888 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.016360 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.017395 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.018336 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.019078 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.019604 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.020546 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.021021 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.021993 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.022270 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.022581 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.023080 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.041588 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bpjf4"] Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.042462 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-bpjf4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.044318 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.044705 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.044889 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.049291 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.075246 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.088475 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.100373 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.113333 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.130482 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.136977 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/834825ad-b4fa-4449-92c6-4299aecbaaec-hosts-file\") pod \"node-resolver-bpjf4\" (UID: \"834825ad-b4fa-4449-92c6-4299aecbaaec\") " pod="openshift-dns/node-resolver-bpjf4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.137041 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwnc4\" (UniqueName: \"kubernetes.io/projected/834825ad-b4fa-4449-92c6-4299aecbaaec-kube-api-access-gwnc4\") pod \"node-resolver-bpjf4\" (UID: \"834825ad-b4fa-4449-92c6-4299aecbaaec\") " pod="openshift-dns/node-resolver-bpjf4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.138616 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b"} Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.138668 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cb705f4165dee17d2975abddc94ef1f0c2180824b50bec6b15f163c228f9dac0"} Dec 02 19:58:05 crc 
kubenswrapper[4807]: I1202 19:58:05.144379 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.147686 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.150382 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f"} Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.151199 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.156503 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"78d904e00ff740dd333729e1d731c9fd55043babf1b29f57fce82651dcd120e9"} Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.158253 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6"} Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.158284 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f"} Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.158295 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"83bfb3d3589e77eb5dfec4d2150bc4ff947cad554b7b0713a21a00746035a64f"} Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.163780 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.177927 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.194611 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.208785 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.223968 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.238182 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwnc4\" (UniqueName: \"kubernetes.io/projected/834825ad-b4fa-4449-92c6-4299aecbaaec-kube-api-access-gwnc4\") pod \"node-resolver-bpjf4\" (UID: 
\"834825ad-b4fa-4449-92c6-4299aecbaaec\") " pod="openshift-dns/node-resolver-bpjf4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.238302 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/834825ad-b4fa-4449-92c6-4299aecbaaec-hosts-file\") pod \"node-resolver-bpjf4\" (UID: \"834825ad-b4fa-4449-92c6-4299aecbaaec\") " pod="openshift-dns/node-resolver-bpjf4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.238400 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/834825ad-b4fa-4449-92c6-4299aecbaaec-hosts-file\") pod \"node-resolver-bpjf4\" (UID: \"834825ad-b4fa-4449-92c6-4299aecbaaec\") " pod="openshift-dns/node-resolver-bpjf4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.240419 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.256256 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.271660 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.271865 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwnc4\" (UniqueName: \"kubernetes.io/projected/834825ad-b4fa-4449-92c6-4299aecbaaec-kube-api-access-gwnc4\") pod \"node-resolver-bpjf4\" (UID: \"834825ad-b4fa-4449-92c6-4299aecbaaec\") " 
pod="openshift-dns/node-resolver-bpjf4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.288251 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.311762 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.326018 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.344739 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.354390 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bpjf4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.365271 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: W1202 19:58:05.373211 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod834825ad_b4fa_4449_92c6_4299aecbaaec.slice/crio-afc264da0ce26c45625e46b0bbc1ac037eb74e4706458f0ce52de96a5a865d8a WatchSource:0}: Error finding container afc264da0ce26c45625e46b0bbc1ac037eb74e4706458f0ce52de96a5a865d8a: Status 404 returned error can't find the container with id afc264da0ce26c45625e46b0bbc1ac037eb74e4706458f0ce52de96a5a865d8a Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.412034 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.460732 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-nxxz4"] Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.461433 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-f5x8r"] Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.461669 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.461755 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.461896 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5plsn"] Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.464371 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.479269 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-wb7h5"] Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.479778 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.487845 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 19:58:05 crc kubenswrapper[4807]: W1202 19:58:05.488110 4807 reflector.go:561] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": failed to list *v1.Secret: secrets "machine-config-daemon-dockercfg-r5tcq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.488162 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.488163 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-r5tcq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-config-daemon-dockercfg-r5tcq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in 
the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.488113 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.488213 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.488132 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.488266 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 19:58:05 crc kubenswrapper[4807]: W1202 19:58:05.488344 4807 reflector.go:561] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.488382 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.488403 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 19:58:05 crc kubenswrapper[4807]: 
I1202 19:58:05.488419 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.488696 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.488873 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 19:58:05 crc kubenswrapper[4807]: W1202 19:58:05.489003 4807 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.489026 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 19:58:05 crc kubenswrapper[4807]: W1202 19:58:05.489005 4807 reflector.go:561] object-"openshift-machine-config-operator"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.489054 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list 
*v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.489185 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.489182 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.489242 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.489442 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.489451 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.505098 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.534052 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.541872 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-etc-kubernetes\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.541918 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-openvswitch\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.541940 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-env-overrides\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.541961 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-slash\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.541979 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-script-lib\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.541997 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/600edb6b-1fb6-4946-9d09-a8e5c94045b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.542016 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9jxp\" (UniqueName: \"kubernetes.io/projected/4aed9271-ad06-407e-b805-80c5dfea98ce-kube-api-access-j9jxp\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.542100 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-cnibin\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.542755 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-run-k8s-cni-cncf-io\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.542819 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-run-netns\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.542843 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-var-lib-cni-multus\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.542862 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dxxc\" (UniqueName: \"kubernetes.io/projected/8a909a25-5ede-458e-af78-4a41b79716a5-kube-api-access-7dxxc\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.542883 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-var-lib-openvswitch\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.542970 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-ovn\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543066 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-netd\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543119 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdj78\" (UniqueName: \"kubernetes.io/projected/798a6158-a963-43b4-941e-ac4f3df2f883-kube-api-access-tdj78\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543175 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-systemd\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543205 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-ovn-kubernetes\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543233 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4aed9271-ad06-407e-b805-80c5dfea98ce-mcd-auth-proxy-config\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543271 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4aed9271-ad06-407e-b805-80c5dfea98ce-rootfs\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543300 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-var-lib-cni-bin\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " 
pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543341 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-var-lib-kubelet\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543368 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4aed9271-ad06-407e-b805-80c5dfea98ce-proxy-tls\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543556 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543817 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-system-cni-dir\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543860 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a909a25-5ede-458e-af78-4a41b79716a5-cni-binary-copy\") pod \"multus-f5x8r\" (UID: 
\"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543896 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-node-log\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543931 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.543974 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-config\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544045 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-system-cni-dir\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544088 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkxb2\" (UniqueName: 
\"kubernetes.io/projected/600edb6b-1fb6-4946-9d09-a8e5c94045b9-kube-api-access-lkxb2\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544117 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-multus-cni-dir\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544150 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-kubelet\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544179 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-netns\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544196 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/798a6158-a963-43b4-941e-ac4f3df2f883-ovn-node-metrics-cert\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544215 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-cnibin\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544248 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/600edb6b-1fb6-4946-9d09-a8e5c94045b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544268 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-systemd-units\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544286 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-run-multus-certs\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544305 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-etc-openvswitch\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544324 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-os-release\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544342 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8a909a25-5ede-458e-af78-4a41b79716a5-multus-daemon-config\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544366 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-multus-socket-dir-parent\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544386 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-hostroot\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544404 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-log-socket\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544419 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-bin\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544448 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-multus-conf-dir\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.544463 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-os-release\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.601533 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.643441 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.645820 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.645960 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-multus-socket-dir-parent\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.645991 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8a909a25-5ede-458e-af78-4a41b79716a5-multus-daemon-config\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646019 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-bin\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646049 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.646110 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 19:58:07.646066362 +0000 UTC m=+22.946973857 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.646154 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646197 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-hostroot\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.646233 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:07.646212586 +0000 UTC m=+22.947120261 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646147 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-bin\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646263 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-hostroot\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646155 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-multus-socket-dir-parent\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646264 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-log-socket\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646316 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-log-socket\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646352 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-multus-conf-dir\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646380 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-os-release\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646391 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-multus-conf-dir\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646416 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-openvswitch\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646440 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-env-overrides\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646482 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-etc-kubernetes\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646494 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-openvswitch\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646506 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/600edb6b-1fb6-4946-9d09-a8e5c94045b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646553 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9jxp\" (UniqueName: \"kubernetes.io/projected/4aed9271-ad06-407e-b805-80c5dfea98ce-kube-api-access-j9jxp\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646579 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-cnibin\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646597 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-slash\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646614 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-script-lib\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646633 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-var-lib-cni-multus\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646630 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-os-release\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646648 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dxxc\" (UniqueName: \"kubernetes.io/projected/8a909a25-5ede-458e-af78-4a41b79716a5-kube-api-access-7dxxc\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646679 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-var-lib-openvswitch\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646675 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-etc-kubernetes\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646765 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-ovn\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646702 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-ovn\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646701 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-cnibin\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646815 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-var-lib-openvswitch\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646824 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-run-k8s-cni-cncf-io\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646858 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-run-netns\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646887 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-netd\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646890 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8a909a25-5ede-458e-af78-4a41b79716a5-multus-daemon-config\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646910 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdj78\" (UniqueName: \"kubernetes.io/projected/798a6158-a963-43b4-941e-ac4f3df2f883-kube-api-access-tdj78\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646926 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-var-lib-cni-multus\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646951 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-systemd\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646956 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-run-netns\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.646972 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-ovn-kubernetes\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647000 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4aed9271-ad06-407e-b805-80c5dfea98ce-rootfs\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647005 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-run-k8s-cni-cncf-io\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647021 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-slash\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647050 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-ovn-kubernetes\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647056 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-systemd\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647024 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4aed9271-ad06-407e-b805-80c5dfea98ce-mcd-auth-proxy-config\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647084 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4aed9271-ad06-407e-b805-80c5dfea98ce-rootfs\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647002 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-netd\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647101 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-var-lib-kubelet\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647123 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-var-lib-kubelet\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647130 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4aed9271-ad06-407e-b805-80c5dfea98ce-proxy-tls\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647137 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-env-overrides\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647162 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-var-lib-cni-bin\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647197 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647223 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647230 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-var-lib-cni-bin\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647241 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/600edb6b-1fb6-4946-9d09-a8e5c94045b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647246 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a909a25-5ede-458e-af78-4a41b79716a5-cni-binary-copy\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647297 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-node-log\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647319 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.647343 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647361 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-node-log\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.647369 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.647403 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647405 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.647461 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:07.647445728 +0000 UTC m=+22.948353433 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647351 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-config\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647516 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-script-lib\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647524 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-system-cni-dir\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647556 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-system-cni-dir\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647580 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkxb2\" (UniqueName: \"kubernetes.io/projected/600edb6b-1fb6-4946-9d09-a8e5c94045b9-kube-api-access-lkxb2\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647603 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-kubelet\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647637 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-netns\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647646 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-system-cni-dir\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647665 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/798a6158-a963-43b4-941e-ac4f3df2f883-ovn-node-metrics-cert\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647692 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-cnibin\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647710 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647740 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647781 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-multus-cni-dir\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647799 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-system-cni-dir\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647807 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/600edb6b-1fb6-4946-9d09-a8e5c94045b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.647817 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647832 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-systemd-units\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647839 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-netns\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647854 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-os-release\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647879 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-run-multus-certs\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647897 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-etc-openvswitch\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647966 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-config\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647972 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-etc-openvswitch\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.648004 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-multus-cni-dir\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.648076 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-os-release\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.648057 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-systemd-units\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.648110 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:07.648100495 +0000 UTC m=+22.949007990 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.648112 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-kubelet\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.648138 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8a909a25-5ede-458e-af78-4a41b79716a5-host-run-multus-certs\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.648139 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/600edb6b-1fb6-4946-9d09-a8e5c94045b9-cnibin\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.647820 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a909a25-5ede-458e-af78-4a41b79716a5-cni-binary-copy\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.648556 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/600edb6b-1fb6-4946-9d09-a8e5c94045b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.657529 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/798a6158-a963-43b4-941e-ac4f3df2f883-ovn-node-metrics-cert\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn"
Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.678513 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.682406 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dxxc\" (UniqueName: \"kubernetes.io/projected/8a909a25-5ede-458e-af78-4a41b79716a5-kube-api-access-7dxxc\") pod \"multus-f5x8r\" (UID: \"8a909a25-5ede-458e-af78-4a41b79716a5\") " pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.682776 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdj78\" (UniqueName: 
\"kubernetes.io/projected/798a6158-a963-43b4-941e-ac4f3df2f883-kube-api-access-tdj78\") pod \"ovnkube-node-5plsn\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.697613 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.723037 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.739148 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.748491 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.748692 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.748745 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.748758 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.748811 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:07.748796232 +0000 UTC m=+23.049703727 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.757754 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.775120 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.784137 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkxb2\" (UniqueName: \"kubernetes.io/projected/600edb6b-1fb6-4946-9d09-a8e5c94045b9-kube-api-access-lkxb2\") pod \"multus-additional-cni-plugins-nxxz4\" (UID: \"600edb6b-1fb6-4946-9d09-a8e5c94045b9\") " pod="openshift-multus/multus-additional-cni-plugins-nxxz4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.788558 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.795519 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-f5x8r" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.800918 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.815198 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.816465 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.822601 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.830972 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: W1202 19:58:05.838745 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod798a6158_a963_43b4_941e_ac4f3df2f883.slice/crio-35727bb2a991ed917b170c54a85ae1b0b9ecd4d65856f8406ea608b4ce53f23f WatchSource:0}: Error finding container 35727bb2a991ed917b170c54a85ae1b0b9ecd4d65856f8406ea608b4ce53f23f: Status 404 returned error can't find the container with id 35727bb2a991ed917b170c54a85ae1b0b9ecd4d65856f8406ea608b4ce53f23f Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.857144 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.870638 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.885263 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.896753 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.911499 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.927073 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.943730 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.976088 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.976238 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.976654 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.976759 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:05 crc kubenswrapper[4807]: I1202 19:58:05.976819 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:05 crc kubenswrapper[4807]: E1202 19:58:05.976879 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.163433 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849" exitCode=0 Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.163514 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849"} Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.163567 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"35727bb2a991ed917b170c54a85ae1b0b9ecd4d65856f8406ea608b4ce53f23f"} Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.164877 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bpjf4" event={"ID":"834825ad-b4fa-4449-92c6-4299aecbaaec","Type":"ContainerStarted","Data":"1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c"} Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.164932 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bpjf4" event={"ID":"834825ad-b4fa-4449-92c6-4299aecbaaec","Type":"ContainerStarted","Data":"afc264da0ce26c45625e46b0bbc1ac037eb74e4706458f0ce52de96a5a865d8a"} Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.167153 4807 generic.go:334] "Generic (PLEG): container finished" podID="600edb6b-1fb6-4946-9d09-a8e5c94045b9" containerID="14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9" exitCode=0 Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.167216 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nxxz4" event={"ID":"600edb6b-1fb6-4946-9d09-a8e5c94045b9","Type":"ContainerDied","Data":"14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9"} Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.167501 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" event={"ID":"600edb6b-1fb6-4946-9d09-a8e5c94045b9","Type":"ContainerStarted","Data":"1624db9e0a3baef645bdadde6f26ca6f178554faa61c026e65ca02cb6f11db65"} Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.179157 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f5x8r" event={"ID":"8a909a25-5ede-458e-af78-4a41b79716a5","Type":"ContainerStarted","Data":"2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828"} Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.179264 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f5x8r" event={"ID":"8a909a25-5ede-458e-af78-4a41b79716a5","Type":"ContainerStarted","Data":"5cfdabda2adf3c739a1e3663a85803f2808aca77d232ea1fbc652594f92b732b"} Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.187133 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.205170 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.220464 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.233670 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.252631 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.267696 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.296356 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.317310 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 
19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.338583 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.352188 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.365318 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.381169 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.394949 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.412151 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.431198 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.448300 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.461364 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.471670 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.485458 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.502893 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.518675 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.532801 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.551611 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.564161 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:06 crc kubenswrapper[4807]: E1202 19:58:06.647356 4807 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 02 19:58:06 crc kubenswrapper[4807]: E1202 19:58:06.647477 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4aed9271-ad06-407e-b805-80c5dfea98ce-mcd-auth-proxy-config podName:4aed9271-ad06-407e-b805-80c5dfea98ce nodeName:}" failed. No retries permitted until 2025-12-02 19:58:07.147442049 +0000 UTC m=+22.448349554 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/4aed9271-ad06-407e-b805-80c5dfea98ce-mcd-auth-proxy-config") pod "machine-config-daemon-wb7h5" (UID: "4aed9271-ad06-407e-b805-80c5dfea98ce") : failed to sync configmap cache: timed out waiting for the condition Dec 02 19:58:06 crc kubenswrapper[4807]: E1202 19:58:06.647546 4807 secret.go:188] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Dec 02 19:58:06 crc kubenswrapper[4807]: E1202 19:58:06.647648 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aed9271-ad06-407e-b805-80c5dfea98ce-proxy-tls podName:4aed9271-ad06-407e-b805-80c5dfea98ce nodeName:}" failed. No retries permitted until 2025-12-02 19:58:07.147625574 +0000 UTC m=+22.448533069 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4aed9271-ad06-407e-b805-80c5dfea98ce-proxy-tls") pod "machine-config-daemon-wb7h5" (UID: "4aed9271-ad06-407e-b805-80c5dfea98ce") : failed to sync secret cache: timed out waiting for the condition Dec 02 19:58:06 crc kubenswrapper[4807]: E1202 19:58:06.672983 4807 projected.go:288] Couldn't get configMap openshift-machine-config-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 02 19:58:06 crc kubenswrapper[4807]: E1202 19:58:06.673033 4807 projected.go:194] Error preparing data for projected volume kube-api-access-j9jxp for pod openshift-machine-config-operator/machine-config-daemon-wb7h5: failed to sync configmap cache: timed out waiting for the condition Dec 02 19:58:06 crc kubenswrapper[4807]: E1202 19:58:06.673146 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4aed9271-ad06-407e-b805-80c5dfea98ce-kube-api-access-j9jxp podName:4aed9271-ad06-407e-b805-80c5dfea98ce nodeName:}" failed. No retries permitted until 2025-12-02 19:58:07.173105711 +0000 UTC m=+22.474013216 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-j9jxp" (UniqueName: "kubernetes.io/projected/4aed9271-ad06-407e-b805-80c5dfea98ce-kube-api-access-j9jxp") pod "machine-config-daemon-wb7h5" (UID: "4aed9271-ad06-407e-b805-80c5dfea98ce") : failed to sync configmap cache: timed out waiting for the condition Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.798287 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.855131 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 19:58:06 crc kubenswrapper[4807]: I1202 19:58:06.969026 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.020454 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.164531 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4aed9271-ad06-407e-b805-80c5dfea98ce-mcd-auth-proxy-config\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.164606 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4aed9271-ad06-407e-b805-80c5dfea98ce-proxy-tls\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.165530 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4aed9271-ad06-407e-b805-80c5dfea98ce-mcd-auth-proxy-config\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.171426 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4aed9271-ad06-407e-b805-80c5dfea98ce-proxy-tls\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.185501 4807 generic.go:334] "Generic (PLEG): container finished" podID="600edb6b-1fb6-4946-9d09-a8e5c94045b9" containerID="917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31" exitCode=0 Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.185587 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" event={"ID":"600edb6b-1fb6-4946-9d09-a8e5c94045b9","Type":"ContainerDied","Data":"917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31"} Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.193426 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a"} Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.193923 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a"} Dec 02 19:58:07 crc 
kubenswrapper[4807]: I1202 19:58:07.193940 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db"} Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.193953 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5"} Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.193967 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5"} Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.193980 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def"} Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.201703 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1"} Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.206420 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.221707 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.237040 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.251347 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.265292 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 
19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.265519 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9jxp\" (UniqueName: \"kubernetes.io/projected/4aed9271-ad06-407e-b805-80c5dfea98ce-kube-api-access-j9jxp\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.269709 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9jxp\" (UniqueName: \"kubernetes.io/projected/4aed9271-ad06-407e-b805-80c5dfea98ce-kube-api-access-j9jxp\") pod \"machine-config-daemon-wb7h5\" (UID: \"4aed9271-ad06-407e-b805-80c5dfea98ce\") " pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.283623 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.296945 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.313498 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.326767 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.335170 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.337250 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: W1202 19:58:07.348397 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aed9271_ad06_407e_b805_80c5dfea98ce.slice/crio-2c2292278977f148df29c4e4b3fcc3245e49a9f10a42fb125be151c19bad4321 WatchSource:0}: Error finding container 2c2292278977f148df29c4e4b3fcc3245e49a9f10a42fb125be151c19bad4321: Status 404 returned error can't find the container with id 2c2292278977f148df29c4e4b3fcc3245e49a9f10a42fb125be151c19bad4321 Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.348465 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.352801 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.353027 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.358588 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 02 
19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.367088 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.386626 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.402808 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.417081 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.427738 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4jpt9"] Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.428246 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4jpt9" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.431184 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.431314 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.431367 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.432473 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.439103 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.451600 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin
\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.468302 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/980d1fa3-8069-42bc-9510-44148f64cab6-serviceca\") pod \"node-ca-4jpt9\" (UID: \"980d1fa3-8069-42bc-9510-44148f64cab6\") " pod="openshift-image-registry/node-ca-4jpt9" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.468389 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5v5t\" (UniqueName: \"kubernetes.io/projected/980d1fa3-8069-42bc-9510-44148f64cab6-kube-api-access-t5v5t\") pod \"node-ca-4jpt9\" (UID: \"980d1fa3-8069-42bc-9510-44148f64cab6\") " pod="openshift-image-registry/node-ca-4jpt9" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.468416 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/980d1fa3-8069-42bc-9510-44148f64cab6-host\") pod \"node-ca-4jpt9\" (UID: \"980d1fa3-8069-42bc-9510-44148f64cab6\") " pod="openshift-image-registry/node-ca-4jpt9" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.469382 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.484945 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.502500 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.516807 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 
19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.541706 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.556393 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.569946 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/980d1fa3-8069-42bc-9510-44148f64cab6-serviceca\") pod \"node-ca-4jpt9\" (UID: \"980d1fa3-8069-42bc-9510-44148f64cab6\") " pod="openshift-image-registry/node-ca-4jpt9" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.570008 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5v5t\" (UniqueName: \"kubernetes.io/projected/980d1fa3-8069-42bc-9510-44148f64cab6-kube-api-access-t5v5t\") pod \"node-ca-4jpt9\" (UID: \"980d1fa3-8069-42bc-9510-44148f64cab6\") " pod="openshift-image-registry/node-ca-4jpt9" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.570028 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/980d1fa3-8069-42bc-9510-44148f64cab6-host\") pod \"node-ca-4jpt9\" (UID: \"980d1fa3-8069-42bc-9510-44148f64cab6\") " pod="openshift-image-registry/node-ca-4jpt9" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.569992 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.570097 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/980d1fa3-8069-42bc-9510-44148f64cab6-host\") pod \"node-ca-4jpt9\" (UID: \"980d1fa3-8069-42bc-9510-44148f64cab6\") " pod="openshift-image-registry/node-ca-4jpt9" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.571275 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/980d1fa3-8069-42bc-9510-44148f64cab6-serviceca\") pod \"node-ca-4jpt9\" (UID: \"980d1fa3-8069-42bc-9510-44148f64cab6\") " pod="openshift-image-registry/node-ca-4jpt9" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.585001 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.586610 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5v5t\" (UniqueName: \"kubernetes.io/projected/980d1fa3-8069-42bc-9510-44148f64cab6-kube-api-access-t5v5t\") pod \"node-ca-4jpt9\" (UID: \"980d1fa3-8069-42bc-9510-44148f64cab6\") " pod="openshift-image-registry/node-ca-4jpt9" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.599323 4807 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.617223 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.631804 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.644943 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.660806 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.670658 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.670840 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.670910 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 19:58:11.670878844 +0000 UTC m=+26.971786339 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.670998 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.671013 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.671026 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.671063 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.671066 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:11.671058109 +0000 UTC m=+26.971965594 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.671109 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:11.67110027 +0000 UTC m=+26.972007765 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.670997 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.671182 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.671240 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.671269 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:11.671261034 +0000 UTC m=+26.972168519 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.679940 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.725052 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.742395 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4jpt9" Dec 02 19:58:07 crc kubenswrapper[4807]: W1202 19:58:07.755805 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod980d1fa3_8069_42bc_9510_44148f64cab6.slice/crio-b9b0fcfb54580e99be96d4cee090d7caef4660c8193e70df1f6e2d81d4dc8c47 WatchSource:0}: Error finding container b9b0fcfb54580e99be96d4cee090d7caef4660c8193e70df1f6e2d81d4dc8c47: Status 404 returned error can't find the container with id b9b0fcfb54580e99be96d4cee090d7caef4660c8193e70df1f6e2d81d4dc8c47 Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.772206 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.772406 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.772433 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.772450 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.772503 4807 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:11.772486574 +0000 UTC m=+27.073394089 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.772800 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.804448 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.848689 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.864205 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.866820 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.866869 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.866883 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.867098 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.885266 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.933820 4807 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.934838 4807 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.937314 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.937352 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.937365 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.937382 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.937394 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:07Z","lastTransitionTime":"2025-12-02T19:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.956525 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.960766 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.960805 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.960817 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.960846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.960859 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:07Z","lastTransitionTime":"2025-12-02T19:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.963530 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.972146 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.972462 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.972527 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.972636 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.972871 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.973000 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.974155 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.978001 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.978029 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.978044 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.978063 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:07 crc kubenswrapper[4807]: I1202 19:58:07.978077 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:07Z","lastTransitionTime":"2025-12-02T19:58:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:07 crc kubenswrapper[4807]: E1202 19:58:07.992470 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:07Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.004743 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.004830 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.004847 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.004899 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.004914 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.010311 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: E1202 19:58:08.021523 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.025465 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.025505 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.025516 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.025543 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.025558 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: E1202 19:58:08.037945 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: E1202 19:58:08.038063 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.040133 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.040186 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.040199 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.040219 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.040231 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.049110 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.143356 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.143400 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.143411 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.143429 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.143443 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.208761 4807 generic.go:334] "Generic (PLEG): container finished" podID="600edb6b-1fb6-4946-9d09-a8e5c94045b9" containerID="066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af" exitCode=0 Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.208875 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" event={"ID":"600edb6b-1fb6-4946-9d09-a8e5c94045b9","Type":"ContainerDied","Data":"066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.210837 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4jpt9" event={"ID":"980d1fa3-8069-42bc-9510-44148f64cab6","Type":"ContainerStarted","Data":"b9b0fcfb54580e99be96d4cee090d7caef4660c8193e70df1f6e2d81d4dc8c47"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.213172 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.213214 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.213227 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"2c2292278977f148df29c4e4b3fcc3245e49a9f10a42fb125be151c19bad4321"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 
19:58:08.220675 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: E1202 19:58:08.222004 4807 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.233116 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.244609 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.245594 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.245635 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.245652 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 
19:58:08.245677 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.245695 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.260077 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.273350 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.306294 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.345338 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.348330 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.348594 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.348613 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.348637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.348650 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.384990 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.424342 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.451548 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.451596 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.451611 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.451633 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.451650 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.463619 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.507849 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.542836 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.561704 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.561761 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.561774 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.561794 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.561809 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.586083 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.630627 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.665572 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.665704 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.665729 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.665747 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.665808 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.686948 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.701546 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.741043 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.768220 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.768256 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.768264 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.768280 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.768290 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.782964 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z 
is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.821147 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.861823 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.870865 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.870922 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.870934 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.870955 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.870972 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.903982 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.945047 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.973480 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.973532 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.973545 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.973564 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.973578 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:08Z","lastTransitionTime":"2025-12-02T19:58:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:08 crc kubenswrapper[4807]: I1202 19:58:08.982030 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:08Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.023350 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 
19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.070027 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.076187 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.076250 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.076273 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.076296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.076311 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:09Z","lastTransitionTime":"2025-12-02T19:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.104219 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.143044 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 
19:58:09.178864 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.178911 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.178922 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.178939 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.178951 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:09Z","lastTransitionTime":"2025-12-02T19:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.182404 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.221865 4807 generic.go:334] "Generic (PLEG): container finished" podID="600edb6b-1fb6-4946-9d09-a8e5c94045b9" containerID="39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32" exitCode=0 Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.221946 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" event={"ID":"600edb6b-1fb6-4946-9d09-a8e5c94045b9","Type":"ContainerDied","Data":"39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32"} Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.228972 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4jpt9" event={"ID":"980d1fa3-8069-42bc-9510-44148f64cab6","Type":"ContainerStarted","Data":"53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5"} Dec 02 19:58:09 crc 
kubenswrapper[4807]: I1202 19:58:09.234679 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e"} Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.239770 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.261654 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.281148 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.281217 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.281237 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.281263 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.281283 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:09Z","lastTransitionTime":"2025-12-02T19:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.302099 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.342095 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.380841 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.384182 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.384246 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.384266 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 
19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.384289 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.384302 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:09Z","lastTransitionTime":"2025-12-02T19:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.424655 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.461705 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.488296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.488752 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.488766 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.488785 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.488798 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:09Z","lastTransitionTime":"2025-12-02T19:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.501702 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.549668 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.582254 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.591632 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.591687 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.591702 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:09 crc 
kubenswrapper[4807]: I1202 19:58:09.591742 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.591757 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:09Z","lastTransitionTime":"2025-12-02T19:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.625580 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.665787 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 
19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.694926 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.694972 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.694982 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.694999 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.695010 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:09Z","lastTransitionTime":"2025-12-02T19:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.707006 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.744097 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.782498 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.797737 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.797784 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.797810 4807 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.797826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.797840 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:09Z","lastTransitionTime":"2025-12-02T19:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.823234 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d8
1ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.861258 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.899656 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.900585 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.900623 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.900633 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.900650 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.900662 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:09Z","lastTransitionTime":"2025-12-02T19:58:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.947159 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z 
is after 2025-08-24T17:21:41Z" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.971582 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.971631 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.971582 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:09 crc kubenswrapper[4807]: E1202 19:58:09.971745 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:09 crc kubenswrapper[4807]: E1202 19:58:09.971830 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:09 crc kubenswrapper[4807]: E1202 19:58:09.971898 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:09 crc kubenswrapper[4807]: I1202 19:58:09.983277 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.003836 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.003888 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.003901 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.003923 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.003938 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:10Z","lastTransitionTime":"2025-12-02T19:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.026589 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19
eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.062668 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.106072 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.107040 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.107086 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.107095 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.107113 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.107124 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:10Z","lastTransitionTime":"2025-12-02T19:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.143404 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.183191 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.210464 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.210550 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.210577 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:10 crc 
kubenswrapper[4807]: I1202 19:58:10.210610 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.210636 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:10Z","lastTransitionTime":"2025-12-02T19:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.227685 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.243757 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" event={"ID":"600edb6b-1fb6-4946-9d09-a8e5c94045b9","Type":"ContainerDied","Data":"e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656"} Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.243757 4807 generic.go:334] "Generic (PLEG): container finished" podID="600edb6b-1fb6-4946-9d09-a8e5c94045b9" containerID="e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656" exitCode=0 Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.262098 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.315325 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.315377 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.315389 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.315409 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.315425 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:10Z","lastTransitionTime":"2025-12-02T19:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.318192 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.343507 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.387171 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.418780 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.418832 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.418843 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:10 crc 
kubenswrapper[4807]: I1202 19:58:10.418862 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.418874 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:10Z","lastTransitionTime":"2025-12-02T19:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.421486 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.465657 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.506497 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 
19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.522299 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.522357 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.522375 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.522400 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.522419 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:10Z","lastTransitionTime":"2025-12-02T19:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.549457 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.582414 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.624453 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.627063 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.627123 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.627136 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.627161 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.627176 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:10Z","lastTransitionTime":"2025-12-02T19:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.666258 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.707774 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.730510 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.730604 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.730639 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.730676 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.730697 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:10Z","lastTransitionTime":"2025-12-02T19:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.750351 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:
58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.784049 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.831678 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.834293 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.834380 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.834409 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.834443 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.834470 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:10Z","lastTransitionTime":"2025-12-02T19:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.867779 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:10Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.937793 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.937856 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.937869 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.937891 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:10 crc kubenswrapper[4807]: I1202 19:58:10.937912 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:10Z","lastTransitionTime":"2025-12-02T19:58:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.041178 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.041251 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.041278 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.041306 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.041329 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:11Z","lastTransitionTime":"2025-12-02T19:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.145412 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.145918 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.145946 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.145982 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.146012 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:11Z","lastTransitionTime":"2025-12-02T19:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.248121 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.248164 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.248175 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.248192 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.248204 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:11Z","lastTransitionTime":"2025-12-02T19:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.253135 4807 generic.go:334] "Generic (PLEG): container finished" podID="600edb6b-1fb6-4946-9d09-a8e5c94045b9" containerID="541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21" exitCode=0 Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.253207 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" event={"ID":"600edb6b-1fb6-4946-9d09-a8e5c94045b9","Type":"ContainerDied","Data":"541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.262026 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.262542 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.262588 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.262604 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.269667 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.285456 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.289933 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.296442 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.297450 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb2
76703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.315098 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.327809 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.338938 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.352755 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.352818 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.352865 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.352889 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.352904 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:11Z","lastTransitionTime":"2025-12-02T19:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.362164 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.376458 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.395052 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.409764 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.426126 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.440746 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.453855 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.456777 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 
19:58:11.456819 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.456828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.456845 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.456856 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:11Z","lastTransitionTime":"2025-12-02T19:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.471516 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d8
1ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.485585 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.502667 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.543299 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.559846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.559925 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.559937 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:11 crc 
kubenswrapper[4807]: I1202 19:58:11.559955 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.559966 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:11Z","lastTransitionTime":"2025-12-02T19:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.583541 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.624883 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 
19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.663156 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.663210 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.663224 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.663250 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.663265 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:11Z","lastTransitionTime":"2025-12-02T19:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.666890 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.719295 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.719466 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.719491 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 19:58:19.71945669 +0000 UTC m=+35.020364185 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.719527 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.719567 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.719582 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.719635 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:19.719623695 +0000 UTC m=+35.020531210 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.720339 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.720414 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.720433 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.720371 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.720512 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:19.720487017 +0000 UTC m=+35.021394712 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.720660 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:19.720629381 +0000 UTC m=+35.021537026 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.721294 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.754239 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.766763 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.766811 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.766823 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.766843 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.766859 4807 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:11Z","lastTransitionTime":"2025-12-02T19:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.781701 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"
iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.821080 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.821321 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.821372 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.821395 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.821482 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:19.821459571 +0000 UTC m=+35.122367066 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.822270 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.863240 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.869431 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.869467 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.869499 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.869518 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.869532 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:11Z","lastTransitionTime":"2025-12-02T19:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.904652 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.948769 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.971339 4807 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.971393 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.971456 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.972981 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.973296 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:11 crc kubenswrapper[4807]: E1202 19:58:11.973574 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.977825 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.977936 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.977965 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.977997 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.978040 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:11Z","lastTransitionTime":"2025-12-02T19:58:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:11 crc kubenswrapper[4807]: I1202 19:58:11.988308 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:11Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.080883 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.080936 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.080951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.080973 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.080987 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:12Z","lastTransitionTime":"2025-12-02T19:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.184645 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.184693 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.184704 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.184746 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.184759 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:12Z","lastTransitionTime":"2025-12-02T19:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.271440 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" event={"ID":"600edb6b-1fb6-4946-9d09-a8e5c94045b9","Type":"ContainerStarted","Data":"ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb"} Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.287757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.287791 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.287799 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.287813 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.287822 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:12Z","lastTransitionTime":"2025-12-02T19:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.390479 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.390546 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.390560 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.390582 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.390599 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:12Z","lastTransitionTime":"2025-12-02T19:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.493522 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.493598 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.493618 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.493649 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.493672 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:12Z","lastTransitionTime":"2025-12-02T19:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.596165 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.596208 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.596218 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.596239 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.596248 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:12Z","lastTransitionTime":"2025-12-02T19:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.700277 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.700738 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.700758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.700778 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.700788 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:12Z","lastTransitionTime":"2025-12-02T19:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.804083 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.804147 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.804161 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.804181 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.804193 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:12Z","lastTransitionTime":"2025-12-02T19:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.907485 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.908305 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.908356 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.908385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:12 crc kubenswrapper[4807]: I1202 19:58:12.908413 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:12Z","lastTransitionTime":"2025-12-02T19:58:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.012226 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.012280 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.012291 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.012310 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.012325 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:13Z","lastTransitionTime":"2025-12-02T19:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.115145 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.115195 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.115208 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.115227 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.115244 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:13Z","lastTransitionTime":"2025-12-02T19:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.217901 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.217950 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.217963 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.217982 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.217995 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:13Z","lastTransitionTime":"2025-12-02T19:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.302352 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.317081 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.321174 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.321229 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.321247 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.321272 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.321289 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:13Z","lastTransitionTime":"2025-12-02T19:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.332864 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.350264 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 
19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.379973 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.396408 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.415542 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.424445 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.424504 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.424514 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.424533 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.424545 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:13Z","lastTransitionTime":"2025-12-02T19:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.434756 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.455003 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.470596 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.487000 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.502863 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.518588 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.527436 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.527508 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.527521 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.527541 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.527558 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:13Z","lastTransitionTime":"2025-12-02T19:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.540382 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:13Z 
is after 2025-08-24T17:21:41Z" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.630353 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.630405 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.630416 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.630433 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.630444 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:13Z","lastTransitionTime":"2025-12-02T19:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.732871 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.732924 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.732935 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.732953 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.732965 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:13Z","lastTransitionTime":"2025-12-02T19:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.835406 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.835450 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.835465 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.835482 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.835493 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:13Z","lastTransitionTime":"2025-12-02T19:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.938707 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.938772 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.938784 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.938803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.938818 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:13Z","lastTransitionTime":"2025-12-02T19:58:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.971471 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.971528 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:13 crc kubenswrapper[4807]: I1202 19:58:13.971505 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:13 crc kubenswrapper[4807]: E1202 19:58:13.971682 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:13 crc kubenswrapper[4807]: E1202 19:58:13.971839 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:13 crc kubenswrapper[4807]: E1202 19:58:13.971929 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.042482 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.042548 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.042564 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.042589 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.042603 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:14Z","lastTransitionTime":"2025-12-02T19:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.145265 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.145298 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.145307 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.145325 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.145339 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:14Z","lastTransitionTime":"2025-12-02T19:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.248744 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.248796 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.248810 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.248833 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.248848 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:14Z","lastTransitionTime":"2025-12-02T19:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.352355 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.352410 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.352434 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.352455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.352468 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:14Z","lastTransitionTime":"2025-12-02T19:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.455190 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.455763 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.455783 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.455806 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.455821 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:14Z","lastTransitionTime":"2025-12-02T19:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.559197 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.559644 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.559892 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.560082 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.560242 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:14Z","lastTransitionTime":"2025-12-02T19:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.663665 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.664099 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.664500 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.664758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.664937 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:14Z","lastTransitionTime":"2025-12-02T19:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.767961 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.768377 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.768531 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.768687 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.768875 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:14Z","lastTransitionTime":"2025-12-02T19:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.872758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.872820 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.872836 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.872856 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.872870 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:14Z","lastTransitionTime":"2025-12-02T19:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.977063 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.977149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.977166 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.977190 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.977211 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:14Z","lastTransitionTime":"2025-12-02T19:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:14 crc kubenswrapper[4807]: I1202 19:58:14.992183 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:14Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.011414 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 
19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.036051 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.053417 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.069452 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.080452 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.080512 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.080541 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.080566 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.080584 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:15Z","lastTransitionTime":"2025-12-02T19:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.090803 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.107210 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.125176 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.143871 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.161833 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.183683 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a82
8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.183829 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.183866 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.183882 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.183902 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.183918 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:15Z","lastTransitionTime":"2025-12-02T19:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.200528 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.223553 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.248024 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.286534 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.286630 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.286647 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.286674 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.286691 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:15Z","lastTransitionTime":"2025-12-02T19:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.288308 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/0.log" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.292773 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c" exitCode=1 Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.293046 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c"} Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.295904 4807 scope.go:117] "RemoveContainer" containerID="e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.314493 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.329800 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.345840 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.363795 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.377953 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.389400 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.389456 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.389485 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.389506 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.389517 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:15Z","lastTransitionTime":"2025-12-02T19:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.395823 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z 
is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.415768 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.437257 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.455631 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.470553 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 
19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.489593 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:14Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.135257 6104 handler.go:190] 
Sending *v1.Pod event handler 6 for removal\\\\nI1202 19:58:14.135754 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 19:58:14.135785 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 19:58:14.135826 6104 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.136205 6104 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.136416 6104 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:58:14.136542 6104 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:58:14.136890 6104 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 19:58:14.136962 6104 factory.go:656] Stopping watch factory\\\\nI1202 19:58:14.136982 6104 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.492879 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.492937 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.492947 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.492963 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.492972 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:15Z","lastTransitionTime":"2025-12-02T19:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.503827 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.517639 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.535069 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.596058 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.596100 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.596111 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.596133 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.596145 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:15Z","lastTransitionTime":"2025-12-02T19:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.614579 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.631669 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.657446 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.675971 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.694647 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.698689 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.698764 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.698779 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.698797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.698808 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:15Z","lastTransitionTime":"2025-12-02T19:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.717619 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.734926 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.747061 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.758548 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.771932 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.784266 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.801752 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.801797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.801807 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.801822 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.801834 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:15Z","lastTransitionTime":"2025-12-02T19:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.803776 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:14Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.135257 6104 handler.go:190] 
Sending *v1.Pod event handler 6 for removal\\\\nI1202 19:58:14.135754 6104 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1202 19:58:14.135785 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 19:58:14.135826 6104 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.136205 6104 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.136416 6104 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:58:14.136542 6104 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:58:14.136890 6104 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 19:58:14.136962 6104 factory.go:656] Stopping watch factory\\\\nI1202 19:58:14.136982 6104 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.816824 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.830522 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e75
76da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.848646 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:15Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.907532 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.907583 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.907606 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.907622 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.907634 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:15Z","lastTransitionTime":"2025-12-02T19:58:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.971613 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.971671 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:15 crc kubenswrapper[4807]: I1202 19:58:15.971618 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:15 crc kubenswrapper[4807]: E1202 19:58:15.971777 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:15 crc kubenswrapper[4807]: E1202 19:58:15.972004 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:15 crc kubenswrapper[4807]: E1202 19:58:15.972178 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.010178 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.010224 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.010237 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.010252 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.010264 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:16Z","lastTransitionTime":"2025-12-02T19:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.112995 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.113035 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.113045 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.113061 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.113072 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:16Z","lastTransitionTime":"2025-12-02T19:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.216458 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.216520 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.216537 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.216564 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.216589 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:16Z","lastTransitionTime":"2025-12-02T19:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.300279 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/0.log" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.303671 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700"} Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.304939 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.320524 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.320606 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.320620 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.320643 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.320660 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:16Z","lastTransitionTime":"2025-12-02T19:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.327762 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.347840 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.365300 4807 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.383452 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.397915 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.413427 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a82
8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.423561 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.423605 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.423613 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.423629 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.423642 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:16Z","lastTransitionTime":"2025-12-02T19:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.429473 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.447443 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.462861 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e75
76da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.479706 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.504076 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:14Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.135257 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 19:58:14.135754 6104 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1202 19:58:14.135785 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 19:58:14.135826 6104 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.136205 6104 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.136416 6104 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:58:14.136542 6104 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:58:14.136890 6104 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 19:58:14.136962 6104 factory.go:656] Stopping watch factory\\\\nI1202 19:58:14.136982 6104 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.519315 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.529125 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.529191 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.529208 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:16 crc 
kubenswrapper[4807]: I1202 19:58:16.529233 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.529252 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:16Z","lastTransitionTime":"2025-12-02T19:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.538850 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.564394 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.632088 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.632215 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.632235 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.632266 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.632284 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:16Z","lastTransitionTime":"2025-12-02T19:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.736209 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.736258 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.736268 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.736286 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.736296 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:16Z","lastTransitionTime":"2025-12-02T19:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.839419 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.839456 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.839466 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.839481 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.839491 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:16Z","lastTransitionTime":"2025-12-02T19:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.942499 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.942568 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.942596 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.942627 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:16 crc kubenswrapper[4807]: I1202 19:58:16.942647 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:16Z","lastTransitionTime":"2025-12-02T19:58:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.045579 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.045625 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.045633 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.045648 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.045656 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:17Z","lastTransitionTime":"2025-12-02T19:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.149149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.149211 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.149224 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.149247 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.149260 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:17Z","lastTransitionTime":"2025-12-02T19:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.253190 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.253258 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.253271 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.253293 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.253307 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:17Z","lastTransitionTime":"2025-12-02T19:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.310638 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/1.log" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.311606 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/0.log" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.315573 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700" exitCode=1 Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.315637 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.315710 4807 scope.go:117] "RemoveContainer" containerID="e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.316401 4807 scope.go:117] "RemoveContainer" containerID="45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700" Dec 02 19:58:17 crc kubenswrapper[4807]: E1202 19:58:17.316609 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.333447 4807 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.347848 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.356824 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.356891 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.356911 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.356937 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.356958 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:17Z","lastTransitionTime":"2025-12-02T19:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.361584 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.379515 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:
04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete 
has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7
b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.397498 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.418984 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:14Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.135257 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 19:58:14.135754 6104 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1202 19:58:14.135785 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 19:58:14.135826 6104 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.136205 6104 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.136416 6104 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:58:14.136542 6104 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:58:14.136890 6104 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 19:58:14.136962 6104 factory.go:656] Stopping watch factory\\\\nI1202 19:58:14.136982 6104 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:16Z\\\",\\\"message\\\":\\\"51 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1202 19:58:16.083988 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 19:58:16.084011 6251 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1202 19:58:16.084019 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z]\\\\nI1202 19:58:16.084025 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\
",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.433483 4807 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.446824 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.458794 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.458837 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.458858 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.458879 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.458892 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:17Z","lastTransitionTime":"2025-12-02T19:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.467926 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.480296 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.493697 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.507006 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.523430 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.542968 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.560767 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.560820 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.560829 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.560847 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.560861 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:17Z","lastTransitionTime":"2025-12-02T19:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.663228 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.663301 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.663325 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.663353 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.663375 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:17Z","lastTransitionTime":"2025-12-02T19:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.767410 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.767479 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.767497 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.767563 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.767585 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:17Z","lastTransitionTime":"2025-12-02T19:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.871214 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.871248 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.871257 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.871273 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.871284 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:17Z","lastTransitionTime":"2025-12-02T19:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.970975 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp"] Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.972208 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.972335 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.972224 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:17 crc kubenswrapper[4807]: E1202 19:58:17.972582 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:17 crc kubenswrapper[4807]: E1202 19:58:17.972576 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.973194 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:17 crc kubenswrapper[4807]: E1202 19:58:17.973377 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.975651 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.975709 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.975758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.975781 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.975798 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:17Z","lastTransitionTime":"2025-12-02T19:58:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.976119 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.978012 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 19:58:17 crc kubenswrapper[4807]: I1202 19:58:17.998042 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-
12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:17Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.014993 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.031401 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.045664 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.058945 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.076801 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a82
8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.078928 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.078989 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.079005 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.079029 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.079046 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.085263 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af96502b-9dc2-4b45-8099-20f6fd28df4c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: \"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.085345 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af96502b-9dc2-4b45-8099-20f6fd28df4c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: \"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.085458 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzfn4\" (UniqueName: \"kubernetes.io/projected/af96502b-9dc2-4b45-8099-20f6fd28df4c-kube-api-access-nzfn4\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: \"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.085512 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af96502b-9dc2-4b45-8099-20f6fd28df4c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: \"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.090887 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.105965 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.119145 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.135649 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.152790 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.167907 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.181338 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.181393 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.181405 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.181424 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.181435 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.185310 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.186705 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af96502b-9dc2-4b45-8099-20f6fd28df4c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: 
\"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.186773 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af96502b-9dc2-4b45-8099-20f6fd28df4c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: \"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.186824 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzfn4\" (UniqueName: \"kubernetes.io/projected/af96502b-9dc2-4b45-8099-20f6fd28df4c-kube-api-access-nzfn4\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: \"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.186845 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af96502b-9dc2-4b45-8099-20f6fd28df4c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: \"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.187556 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af96502b-9dc2-4b45-8099-20f6fd28df4c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: \"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.187575 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/af96502b-9dc2-4b45-8099-20f6fd28df4c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: \"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.196554 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af96502b-9dc2-4b45-8099-20f6fd28df4c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: \"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.204141 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzfn4\" (UniqueName: \"kubernetes.io/projected/af96502b-9dc2-4b45-8099-20f6fd28df4c-kube-api-access-nzfn4\") pod \"ovnkube-control-plane-749d76644c-ht6kp\" (UID: \"af96502b-9dc2-4b45-8099-20f6fd28df4c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.207094 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e267813d3499d2bc0c176853689b0f8647c336ed669b096fccd26beeb63c444c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:14Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.135257 6104 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1202 19:58:14.135754 6104 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1202 19:58:14.135785 6104 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1202 19:58:14.135826 6104 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.136205 6104 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:58:14.136416 6104 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:58:14.136542 6104 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:58:14.136890 6104 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 19:58:14.136962 6104 factory.go:656] Stopping watch factory\\\\nI1202 19:58:14.136982 6104 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:16Z\\\",\\\"message\\\":\\\"51 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1202 19:58:16.083988 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 19:58:16.084011 6251 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1202 19:58:16.084019 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network 
controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z]\\\\nI1202 19:58:16.084025 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\
",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.220514 4807 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.278521 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.278583 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.278612 4807 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.278642 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.278665 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.295384 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" Dec 02 19:58:18 crc kubenswrapper[4807]: E1202 19:58:18.298115 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.302781 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.302846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.302864 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.302892 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.302910 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.322821 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/1.log" Dec 02 19:58:18 crc kubenswrapper[4807]: E1202 19:58:18.326677 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae66
9\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\
\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":
485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"sys
temUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.331225 4807 scope.go:117] "RemoveContainer" containerID="45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700" Dec 02 19:58:18 crc kubenswrapper[4807]: E1202 19:58:18.331399 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.332054 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" event={"ID":"af96502b-9dc2-4b45-8099-20f6fd28df4c","Type":"ContainerStarted","Data":"0f6526cb3ebda63427946604b31bb8e79ed8138a53a4599d6bc95a99e4caf642"} Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.332477 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.332526 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.332536 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.332584 4807 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.332602 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: E1202 19:58:18.346250 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.347383 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.353969 4807 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.354000 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.354012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.354027 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.354039 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: E1202 19:58:18.376323 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.379310 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.390870 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.390909 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.390920 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.390937 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.390948 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.406851 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: E1202 19:58:18.415187 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: E1202 19:58:18.415299 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.417589 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.417609 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.417618 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.417634 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.417645 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.433408 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.462939 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3
ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.474402 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4daba
f31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.489157 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.504423 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.518092 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.520669 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.520706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.520741 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc 
kubenswrapper[4807]: I1202 19:58:18.520768 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.520783 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.534978 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19
:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 
1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.548659 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.567874 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:16Z\\\",\\\"message\\\":\\\"51 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1202 19:58:16.083988 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 
19:58:16.084011 6251 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1202 19:58:16.084019 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z]\\\\nI1202 19:58:16.084025 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e9
8f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.581984 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.596256 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.615515 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:18Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.624045 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.624092 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.624105 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.624130 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.624142 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.728005 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.728113 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.728140 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.728171 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.728191 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.831972 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.832036 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.832051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.832073 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.832090 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.936292 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.936373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.936424 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.936457 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:18 crc kubenswrapper[4807]: I1202 19:58:18.936479 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:18Z","lastTransitionTime":"2025-12-02T19:58:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.039905 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.040015 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.040046 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.040073 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.040092 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:19Z","lastTransitionTime":"2025-12-02T19:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.126702 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7z9t6"] Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.127961 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.128118 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.142785 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.142859 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.142883 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.142916 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.142942 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:19Z","lastTransitionTime":"2025-12-02T19:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.152884 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.175877 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.196709 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.202421 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.202494 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdm6j\" (UniqueName: \"kubernetes.io/projected/1cb49a08-30b0-4353-ad4a-23362f281475-kube-api-access-rdm6j\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.214555 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.228449 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.246492 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.246570 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.246589 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.246640 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.246659 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:19Z","lastTransitionTime":"2025-12-02T19:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.248922 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z 
is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.263149 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.285218 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.303214 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.303314 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdm6j\" (UniqueName: \"kubernetes.io/projected/1cb49a08-30b0-4353-ad4a-23362f281475-kube-api-access-rdm6j\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.303489 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.303630 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs podName:1cb49a08-30b0-4353-ad4a-23362f281475 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:19.803594212 +0000 UTC m=+35.104501907 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs") pod "network-metrics-daemon-7z9t6" (UID: "1cb49a08-30b0-4353-ad4a-23362f281475") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.307570 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.323708 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc 
kubenswrapper[4807]: I1202 19:58:19.334019 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdm6j\" (UniqueName: \"kubernetes.io/projected/1cb49a08-30b0-4353-ad4a-23362f281475-kube-api-access-rdm6j\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.349347 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.349419 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.349431 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.349448 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.349460 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:19Z","lastTransitionTime":"2025-12-02T19:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.349619 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.368561 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.387332 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.417583 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:16Z\\\",\\\"message\\\":\\\"51 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1202 19:58:16.083988 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 
19:58:16.084011 6251 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1202 19:58:16.084019 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z]\\\\nI1202 19:58:16.084025 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e9
8f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.438403 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.452457 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.452512 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.452522 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:19 crc 
kubenswrapper[4807]: I1202 19:58:19.452541 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.452552 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:19Z","lastTransitionTime":"2025-12-02T19:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.455099 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19
:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 
1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:19Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.555526 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.555584 4807 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.555597 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.555618 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.555631 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:19Z","lastTransitionTime":"2025-12-02T19:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.658636 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.658691 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.658704 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.658755 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.658773 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:19Z","lastTransitionTime":"2025-12-02T19:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.761770 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.761819 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.761828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.761845 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.761856 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:19Z","lastTransitionTime":"2025-12-02T19:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.808998 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.809135 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.809174 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.809221 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.809270 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 19:58:35.809224578 +0000 UTC m=+51.110132083 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.809348 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.809355 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.809382 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.809395 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.809407 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.809401 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.809477 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.809421 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:35.809400133 +0000 UTC m=+51.110307858 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.809542 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:35.809529586 +0000 UTC m=+51.110437101 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.809558 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs podName:1cb49a08-30b0-4353-ad4a-23362f281475 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:20.809550417 +0000 UTC m=+36.110457922 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs") pod "network-metrics-daemon-7z9t6" (UID: "1cb49a08-30b0-4353-ad4a-23362f281475") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.809573 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:35.809567317 +0000 UTC m=+51.110474822 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.864792 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.864838 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.864854 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.864875 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.864890 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:19Z","lastTransitionTime":"2025-12-02T19:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.911043 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.911377 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.911446 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.911478 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.911610 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:35.911575408 +0000 UTC m=+51.212482943 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.968131 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.968196 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.968214 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.968241 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.968263 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:19Z","lastTransitionTime":"2025-12-02T19:58:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.971566 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.971628 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:19 crc kubenswrapper[4807]: I1202 19:58:19.971592 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.971825 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.972139 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:19 crc kubenswrapper[4807]: E1202 19:58:19.972251 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.070524 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.070568 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.070582 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.070602 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.070617 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:20Z","lastTransitionTime":"2025-12-02T19:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.174129 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.174174 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.174182 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.174198 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.174210 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:20Z","lastTransitionTime":"2025-12-02T19:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.281786 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.282130 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.282244 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.282338 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.282540 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:20Z","lastTransitionTime":"2025-12-02T19:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.342821 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" event={"ID":"af96502b-9dc2-4b45-8099-20f6fd28df4c","Type":"ContainerStarted","Data":"c46fe6ed5b39d7061be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a"} Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.386292 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.386334 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.386345 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.386363 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.386374 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:20Z","lastTransitionTime":"2025-12-02T19:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.489056 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.490369 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.490512 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.490653 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.490828 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:20Z","lastTransitionTime":"2025-12-02T19:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.593977 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.594037 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.594050 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.594070 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.594087 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:20Z","lastTransitionTime":"2025-12-02T19:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.696832 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.696896 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.696911 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.696931 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.696943 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:20Z","lastTransitionTime":"2025-12-02T19:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.799880 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.799950 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.799973 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.800001 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.800023 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:20Z","lastTransitionTime":"2025-12-02T19:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.820583 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:20 crc kubenswrapper[4807]: E1202 19:58:20.820748 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:20 crc kubenswrapper[4807]: E1202 19:58:20.820996 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs podName:1cb49a08-30b0-4353-ad4a-23362f281475 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:22.820981557 +0000 UTC m=+38.121889052 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs") pod "network-metrics-daemon-7z9t6" (UID: "1cb49a08-30b0-4353-ad4a-23362f281475") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.902549 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.902634 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.902665 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.902693 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.902754 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:20Z","lastTransitionTime":"2025-12-02T19:58:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:20 crc kubenswrapper[4807]: I1202 19:58:20.972088 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:20 crc kubenswrapper[4807]: E1202 19:58:20.972282 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.005144 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.005214 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.005229 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.005250 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.005263 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:21Z","lastTransitionTime":"2025-12-02T19:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.107855 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.108700 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.108881 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.109014 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.109194 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:21Z","lastTransitionTime":"2025-12-02T19:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.213051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.213109 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.213127 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.213147 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.213158 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:21Z","lastTransitionTime":"2025-12-02T19:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.315608 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.315671 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.315688 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.315744 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.315764 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:21Z","lastTransitionTime":"2025-12-02T19:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.347540 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" event={"ID":"af96502b-9dc2-4b45-8099-20f6fd28df4c","Type":"ContainerStarted","Data":"ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc"} Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.360903 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.375386 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.390641 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.414606 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.418734 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:21 crc 
kubenswrapper[4807]: I1202 19:58:21.418773 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.418786 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.418802 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.418813 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:21Z","lastTransitionTime":"2025-12-02T19:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.442964 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.457790 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.474507 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.489007 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.502614 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.513578 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc 
kubenswrapper[4807]: I1202 19:58:21.521387 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.521459 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.521474 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.521496 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.521511 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:21Z","lastTransitionTime":"2025-12-02T19:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.528894 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.548535 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.565523 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e75
76da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.582941 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.606492 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:16Z\\\",\\\"message\\\":\\\"51 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1202 19:58:16.083988 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 
19:58:16.084011 6251 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1202 19:58:16.084019 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z]\\\\nI1202 19:58:16.084025 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e9
8f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.618752 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:21Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.624656 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.624695 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.624709 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:21 crc 
kubenswrapper[4807]: I1202 19:58:21.624743 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.624758 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:21Z","lastTransitionTime":"2025-12-02T19:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.727179 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.727266 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.727300 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.727329 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.727350 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:21Z","lastTransitionTime":"2025-12-02T19:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.836580 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.836686 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.836743 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.837006 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.837038 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:21Z","lastTransitionTime":"2025-12-02T19:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.940609 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.940679 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.940691 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.940710 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.940736 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:21Z","lastTransitionTime":"2025-12-02T19:58:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.972006 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.972068 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:21 crc kubenswrapper[4807]: I1202 19:58:21.972097 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:21 crc kubenswrapper[4807]: E1202 19:58:21.972188 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:21 crc kubenswrapper[4807]: E1202 19:58:21.972317 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:21 crc kubenswrapper[4807]: E1202 19:58:21.972418 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.043173 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.043208 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.043216 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.043229 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.043239 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:22Z","lastTransitionTime":"2025-12-02T19:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.146002 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.146051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.146064 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.146081 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.146095 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:22Z","lastTransitionTime":"2025-12-02T19:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.249278 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.249321 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.249337 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.249359 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.249371 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:22Z","lastTransitionTime":"2025-12-02T19:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.350965 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.351023 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.351033 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.351049 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.351060 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:22Z","lastTransitionTime":"2025-12-02T19:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.453340 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.453432 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.453456 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.453485 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.453507 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:22Z","lastTransitionTime":"2025-12-02T19:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.556329 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.556374 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.556385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.556400 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.556412 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:22Z","lastTransitionTime":"2025-12-02T19:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.659135 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.659183 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.659197 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.659214 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.659226 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:22Z","lastTransitionTime":"2025-12-02T19:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.762082 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.762135 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.762146 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.762167 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.762180 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:22Z","lastTransitionTime":"2025-12-02T19:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.846147 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:22 crc kubenswrapper[4807]: E1202 19:58:22.846412 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:22 crc kubenswrapper[4807]: E1202 19:58:22.846575 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs podName:1cb49a08-30b0-4353-ad4a-23362f281475 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:26.846537649 +0000 UTC m=+42.147445174 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs") pod "network-metrics-daemon-7z9t6" (UID: "1cb49a08-30b0-4353-ad4a-23362f281475") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.864586 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.864675 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.864699 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.864770 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.864804 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:22Z","lastTransitionTime":"2025-12-02T19:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.968385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.968437 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.968449 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.968465 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.968475 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:22Z","lastTransitionTime":"2025-12-02T19:58:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:22 crc kubenswrapper[4807]: I1202 19:58:22.971850 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:22 crc kubenswrapper[4807]: E1202 19:58:22.972163 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.070800 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.070838 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.070848 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.070863 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.070876 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:23Z","lastTransitionTime":"2025-12-02T19:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.173594 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.173644 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.173657 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.173674 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.173688 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:23Z","lastTransitionTime":"2025-12-02T19:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.277221 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.277289 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.277336 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.277361 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.277378 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:23Z","lastTransitionTime":"2025-12-02T19:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.380440 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.380510 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.380529 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.380553 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.380573 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:23Z","lastTransitionTime":"2025-12-02T19:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.483346 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.483388 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.483397 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.483409 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.483418 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:23Z","lastTransitionTime":"2025-12-02T19:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.586058 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.586100 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.586170 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.586188 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.586202 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:23Z","lastTransitionTime":"2025-12-02T19:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.689597 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.689669 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.689693 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.689752 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.689772 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:23Z","lastTransitionTime":"2025-12-02T19:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.793563 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.793655 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.793678 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.793704 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.793752 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:23Z","lastTransitionTime":"2025-12-02T19:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.897221 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.897301 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.897331 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.897361 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.897380 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:23Z","lastTransitionTime":"2025-12-02T19:58:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.972238 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.972242 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:23 crc kubenswrapper[4807]: E1202 19:58:23.972429 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:23 crc kubenswrapper[4807]: I1202 19:58:23.972252 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:23 crc kubenswrapper[4807]: E1202 19:58:23.972522 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:23 crc kubenswrapper[4807]: E1202 19:58:23.972824 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.000291 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.000369 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.000386 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.000401 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.000413 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:24Z","lastTransitionTime":"2025-12-02T19:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.103640 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.103759 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.103787 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.103821 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.103845 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:24Z","lastTransitionTime":"2025-12-02T19:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.207199 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.207264 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.207283 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.207309 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.207326 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:24Z","lastTransitionTime":"2025-12-02T19:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.310472 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.310547 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.310566 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.310594 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.310613 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:24Z","lastTransitionTime":"2025-12-02T19:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.414357 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.414418 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.414434 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.414456 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.414473 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:24Z","lastTransitionTime":"2025-12-02T19:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.517436 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.517511 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.517529 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.517560 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.517580 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:24Z","lastTransitionTime":"2025-12-02T19:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.621068 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.621165 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.621193 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.621226 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.621251 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:24Z","lastTransitionTime":"2025-12-02T19:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.724780 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.724862 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.724886 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.724917 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.724943 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:24Z","lastTransitionTime":"2025-12-02T19:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.828001 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.828050 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.828063 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.828081 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.828094 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:24Z","lastTransitionTime":"2025-12-02T19:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.931574 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.931655 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.931673 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.931698 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.931734 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:24Z","lastTransitionTime":"2025-12-02T19:58:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.972539 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:24 crc kubenswrapper[4807]: E1202 19:58:24.972712 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:24 crc kubenswrapper[4807]: I1202 19:58:24.995374 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 
19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:24Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.018019 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.034903 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.034958 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.034975 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.034998 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.035016 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:25Z","lastTransitionTime":"2025-12-02T19:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.049416 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:16Z\\\",\\\"message\\\":\\\"51 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1202 19:58:16.083988 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 
19:58:16.084011 6251 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1202 19:58:16.084019 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z]\\\\nI1202 19:58:16.084025 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e9
8f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.069436 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.085707 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.110530 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.126397 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.138326 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.138397 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.138422 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.138454 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.138484 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:25Z","lastTransitionTime":"2025-12-02T19:58:25Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.150872 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256
:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.165377 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.180245 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.192761 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.210167 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a82
8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.221436 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2
e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.
126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.232701 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc 
kubenswrapper[4807]: I1202 19:58:25.241506 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.241559 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.241571 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.241591 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.241605 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:25Z","lastTransitionTime":"2025-12-02T19:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.247743 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.267850 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:25Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.344366 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.344393 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.344402 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.344414 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.344423 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:25Z","lastTransitionTime":"2025-12-02T19:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.447754 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.448152 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.448384 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.448559 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.448701 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:25Z","lastTransitionTime":"2025-12-02T19:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.551728 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.551773 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.551786 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.551800 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.551811 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:25Z","lastTransitionTime":"2025-12-02T19:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.654438 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.654498 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.654507 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.654527 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.654538 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:25Z","lastTransitionTime":"2025-12-02T19:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.758251 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.758329 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.758344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.758364 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.758377 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:25Z","lastTransitionTime":"2025-12-02T19:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.860969 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.861014 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.861026 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.861042 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.861055 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:25Z","lastTransitionTime":"2025-12-02T19:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.964138 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.964219 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.964241 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.964270 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.964291 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:25Z","lastTransitionTime":"2025-12-02T19:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.971481 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.971541 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 19:58:25 crc kubenswrapper[4807]: I1202 19:58:25.971550 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 19:58:25 crc kubenswrapper[4807]: E1202 19:58:25.971797 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 19:58:25 crc kubenswrapper[4807]: E1202 19:58:25.971910 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 19:58:25 crc kubenswrapper[4807]: E1202 19:58:25.972017 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.066629 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.066689 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.066709 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.066750 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.066764 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:26Z","lastTransitionTime":"2025-12-02T19:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.170582 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.170670 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.170709 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.170774 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.170799 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:26Z","lastTransitionTime":"2025-12-02T19:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.273900 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.274053 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.274083 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.274207 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.274234 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:26Z","lastTransitionTime":"2025-12-02T19:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.376793 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.376859 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.376896 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.376928 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.376951 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:26Z","lastTransitionTime":"2025-12-02T19:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.480532 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.480594 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.480616 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.480645 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.480666 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:26Z","lastTransitionTime":"2025-12-02T19:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.584087 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.584173 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.584197 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.584222 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.584239 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:26Z","lastTransitionTime":"2025-12-02T19:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.687527 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.687605 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.687626 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.687657 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.687681 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:26Z","lastTransitionTime":"2025-12-02T19:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.790773 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.790828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.790841 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.790857 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.790868 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:26Z","lastTransitionTime":"2025-12-02T19:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.892531 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6"
Dec 02 19:58:26 crc kubenswrapper[4807]: E1202 19:58:26.892710 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 19:58:26 crc kubenswrapper[4807]: E1202 19:58:26.892819 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs podName:1cb49a08-30b0-4353-ad4a-23362f281475 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:34.892805291 +0000 UTC m=+50.193712786 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs") pod "network-metrics-daemon-7z9t6" (UID: "1cb49a08-30b0-4353-ad4a-23362f281475") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.894127 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.894163 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.894174 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.894190 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.894201 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:26Z","lastTransitionTime":"2025-12-02T19:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.971554 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6"
Dec 02 19:58:26 crc kubenswrapper[4807]: E1202 19:58:26.971848 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.996878 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.996949 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.997167 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.997197 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:26 crc kubenswrapper[4807]: I1202 19:58:26.997219 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:26Z","lastTransitionTime":"2025-12-02T19:58:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.099916 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.099963 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.099975 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.099993 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.100008 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:27Z","lastTransitionTime":"2025-12-02T19:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.202637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.202711 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.202794 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.202825 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.202844 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:27Z","lastTransitionTime":"2025-12-02T19:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.308330 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.308395 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.308408 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.308423 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.308436 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:27Z","lastTransitionTime":"2025-12-02T19:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.411913 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.412008 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.412024 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.412049 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.412063 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:27Z","lastTransitionTime":"2025-12-02T19:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.514543 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.514614 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.514632 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.514654 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.514671 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:27Z","lastTransitionTime":"2025-12-02T19:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.617750 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.617808 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.617819 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.617839 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.617853 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:27Z","lastTransitionTime":"2025-12-02T19:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.720803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.720856 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.720868 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.720886 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.720900 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:27Z","lastTransitionTime":"2025-12-02T19:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.823543 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.823603 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.823621 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.823643 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.823660 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:27Z","lastTransitionTime":"2025-12-02T19:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.927056 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.927135 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.927154 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.927181 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.927199 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:27Z","lastTransitionTime":"2025-12-02T19:58:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.972117 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.972181 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 19:58:27 crc kubenswrapper[4807]: I1202 19:58:27.972254 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 19:58:27 crc kubenswrapper[4807]: E1202 19:58:27.972455 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 19:58:27 crc kubenswrapper[4807]: E1202 19:58:27.972609 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 19:58:27 crc kubenswrapper[4807]: E1202 19:58:27.972785 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.031007 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.031107 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.031135 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.031168 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.031193 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.134290 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.134351 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.134360 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.134379 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.134394 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.237628 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.237686 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.237698 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.237749 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.237764 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.340433 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.340473 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.340482 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.340497 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.340508 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.444235 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.444321 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.444341 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.444368 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.444390 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.547217 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.547296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.547320 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.547348 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.547371 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.589022 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.589102 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.589125 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.589149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.589170 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:28 crc kubenswrapper[4807]: E1202 19:58:28.610874 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.615819 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.615877 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.615900 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.615929 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.615952 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:28 crc kubenswrapper[4807]: E1202 19:58:28.637536 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.643799 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.643864 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.643874 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.643892 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.643905 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:28 crc kubenswrapper[4807]: E1202 19:58:28.657708 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.661818 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.661858 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.661867 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.661884 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.661894 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:28 crc kubenswrapper[4807]: E1202 19:58:28.673494 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.677561 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.677613 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.677624 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.677640 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.677650 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:28 crc kubenswrapper[4807]: E1202 19:58:28.692410 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:28Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:28 crc kubenswrapper[4807]: E1202 19:58:28.692566 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.694237 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.694275 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.694288 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.694312 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.694330 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.797627 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.797699 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.797772 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.797805 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.797826 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.901616 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.901679 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.901696 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.901748 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.901776 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:28Z","lastTransitionTime":"2025-12-02T19:58:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:28 crc kubenswrapper[4807]: I1202 19:58:28.972178 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:28 crc kubenswrapper[4807]: E1202 19:58:28.972477 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.004441 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.004507 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.004529 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.004557 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.004580 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:29Z","lastTransitionTime":"2025-12-02T19:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.106876 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.106950 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.106965 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.106985 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.107002 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:29Z","lastTransitionTime":"2025-12-02T19:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.219622 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.219689 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.219702 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.219728 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.219739 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:29Z","lastTransitionTime":"2025-12-02T19:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.322870 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.322928 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.322950 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.322977 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.322998 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:29Z","lastTransitionTime":"2025-12-02T19:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.426003 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.426067 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.426086 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.426111 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.426129 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:29Z","lastTransitionTime":"2025-12-02T19:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.529253 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.529305 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.529321 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.529343 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.529360 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:29Z","lastTransitionTime":"2025-12-02T19:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.632416 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.632467 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.632485 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.632510 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.632529 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:29Z","lastTransitionTime":"2025-12-02T19:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.735141 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.735176 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.735184 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.735196 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.735204 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:29Z","lastTransitionTime":"2025-12-02T19:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.838531 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.838582 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.838598 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.838620 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.838637 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:29Z","lastTransitionTime":"2025-12-02T19:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.941454 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.941507 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.941524 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.941548 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:29 crc kubenswrapper[4807]: I1202 19:58:29.941567 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:29Z","lastTransitionTime":"2025-12-02T19:58:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.044400 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.044445 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.044455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.044472 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.044484 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:30Z","lastTransitionTime":"2025-12-02T19:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.138293 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.138475 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.138580 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:30 crc kubenswrapper[4807]: E1202 19:58:30.138819 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.138886 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:30 crc kubenswrapper[4807]: E1202 19:58:30.139063 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:30 crc kubenswrapper[4807]: E1202 19:58:30.139164 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:30 crc kubenswrapper[4807]: E1202 19:58:30.139217 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.147592 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.147651 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.147673 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.147704 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.147767 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:30Z","lastTransitionTime":"2025-12-02T19:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.251202 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.251248 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.251258 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.251274 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.251286 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:30Z","lastTransitionTime":"2025-12-02T19:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.353882 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.353945 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.353955 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.353974 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.353985 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:30Z","lastTransitionTime":"2025-12-02T19:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.456402 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.456459 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.456474 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.456494 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.456507 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:30Z","lastTransitionTime":"2025-12-02T19:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.560780 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.560884 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.560912 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.560945 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.560974 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:30Z","lastTransitionTime":"2025-12-02T19:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.663895 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.664433 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.664539 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.664629 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.664815 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:30Z","lastTransitionTime":"2025-12-02T19:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.767255 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.767526 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.767679 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.767846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.767956 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:30Z","lastTransitionTime":"2025-12-02T19:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.871630 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.872384 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.872471 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.872552 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.872640 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:30Z","lastTransitionTime":"2025-12-02T19:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.975039 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.975112 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.975179 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.975203 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:30 crc kubenswrapper[4807]: I1202 19:58:30.975220 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:30Z","lastTransitionTime":"2025-12-02T19:58:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.078119 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.078163 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.078171 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.078184 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.078196 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:31Z","lastTransitionTime":"2025-12-02T19:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.181550 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.181619 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.181629 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.181645 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.181655 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:31Z","lastTransitionTime":"2025-12-02T19:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.284486 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.284537 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.284551 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.284572 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.284585 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:31Z","lastTransitionTime":"2025-12-02T19:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.391310 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.391366 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.391378 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.391456 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.391506 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:31Z","lastTransitionTime":"2025-12-02T19:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.494607 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.494660 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.494670 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.494685 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.494695 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:31Z","lastTransitionTime":"2025-12-02T19:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.597263 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.597297 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.597307 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.597326 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.597338 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:31Z","lastTransitionTime":"2025-12-02T19:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.699775 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.699815 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.699826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.699861 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.699871 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:31Z","lastTransitionTime":"2025-12-02T19:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.802624 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.802706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.802771 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.802806 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.802831 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:31Z","lastTransitionTime":"2025-12-02T19:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.905467 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.905540 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.905557 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.905978 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.906056 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:31Z","lastTransitionTime":"2025-12-02T19:58:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.971640 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.971706 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.971873 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:31 crc kubenswrapper[4807]: E1202 19:58:31.971984 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:31 crc kubenswrapper[4807]: I1202 19:58:31.972000 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:31 crc kubenswrapper[4807]: E1202 19:58:31.972147 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:31 crc kubenswrapper[4807]: E1202 19:58:31.972318 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:31 crc kubenswrapper[4807]: E1202 19:58:31.972474 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.009924 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.010020 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.010045 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.010076 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.010100 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:32Z","lastTransitionTime":"2025-12-02T19:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.113880 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.113955 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.113977 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.114006 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.114029 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:32Z","lastTransitionTime":"2025-12-02T19:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.217952 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.218018 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.218035 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.218055 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.218070 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:32Z","lastTransitionTime":"2025-12-02T19:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.321293 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.321342 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.321354 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.321373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.321386 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:32Z","lastTransitionTime":"2025-12-02T19:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.425841 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.425921 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.425935 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.425961 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.425978 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:32Z","lastTransitionTime":"2025-12-02T19:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.529502 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.529575 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.529602 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.529635 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.529661 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:32Z","lastTransitionTime":"2025-12-02T19:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.633027 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.633115 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.633148 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.633200 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.633224 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:32Z","lastTransitionTime":"2025-12-02T19:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.736157 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.736210 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.736223 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.736242 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.736256 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:32Z","lastTransitionTime":"2025-12-02T19:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.839605 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.839647 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.839655 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.839674 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.839684 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:32Z","lastTransitionTime":"2025-12-02T19:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.942165 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.942217 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.942228 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.942245 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.942260 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:32Z","lastTransitionTime":"2025-12-02T19:58:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:32 crc kubenswrapper[4807]: I1202 19:58:32.973131 4807 scope.go:117] "RemoveContainer" containerID="45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.045296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.045339 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.045351 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.045370 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.045382 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:33Z","lastTransitionTime":"2025-12-02T19:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.148182 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.148233 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.148247 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.148269 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.148291 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:33Z","lastTransitionTime":"2025-12-02T19:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.250678 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.250736 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.250748 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.250763 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.250773 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:33Z","lastTransitionTime":"2025-12-02T19:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.353760 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.354099 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.354121 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.354143 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.354158 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:33Z","lastTransitionTime":"2025-12-02T19:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.392992 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/1.log" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.397154 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2"} Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.397660 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.413657 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.428168 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.441331 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.457622 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.457669 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.457681 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:33 crc 
kubenswrapper[4807]: I1202 19:58:33.457697 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.457710 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:33Z","lastTransitionTime":"2025-12-02T19:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.461618 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19
:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 
1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.488138 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.513828 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:16Z\\\",\\\"message\\\":\\\"51 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1202 19:58:16.083988 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 
19:58:16.084011 6251 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1202 19:58:16.084019 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z]\\\\nI1202 19:58:16.084025 6251 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/ru
n/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.529179 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.543020 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.559795 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.560600 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.560636 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.560646 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.560660 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.560672 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:33Z","lastTransitionTime":"2025-12-02T19:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.571975 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.585706 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.597782 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.612039 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.626294 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.637870 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28b
e18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.651455 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:33Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:33 crc 
kubenswrapper[4807]: I1202 19:58:33.664540 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.664603 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.664620 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.664647 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.664666 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:33Z","lastTransitionTime":"2025-12-02T19:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.767973 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.768037 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.768051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.768072 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.768089 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:33Z","lastTransitionTime":"2025-12-02T19:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.872041 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.872097 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.872111 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.872137 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.872153 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:33Z","lastTransitionTime":"2025-12-02T19:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.971893 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.971908 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.971908 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.972082 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:33 crc kubenswrapper[4807]: E1202 19:58:33.972294 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:33 crc kubenswrapper[4807]: E1202 19:58:33.972430 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:33 crc kubenswrapper[4807]: E1202 19:58:33.972617 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:33 crc kubenswrapper[4807]: E1202 19:58:33.972739 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.974861 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.974928 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.974951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.974978 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:33 crc kubenswrapper[4807]: I1202 19:58:33.974996 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:33Z","lastTransitionTime":"2025-12-02T19:58:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.078404 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.078463 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.078507 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.078532 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.078550 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:34Z","lastTransitionTime":"2025-12-02T19:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.180882 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.180954 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.180982 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.181016 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.181041 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:34Z","lastTransitionTime":"2025-12-02T19:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.284569 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.284616 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.284627 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.284642 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.284651 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:34Z","lastTransitionTime":"2025-12-02T19:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.387320 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.387370 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.387378 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.387396 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.387406 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:34Z","lastTransitionTime":"2025-12-02T19:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.489796 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.489844 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.489855 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.489875 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.489890 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:34Z","lastTransitionTime":"2025-12-02T19:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.593194 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.593247 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.593258 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.593280 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.593293 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:34Z","lastTransitionTime":"2025-12-02T19:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.695855 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.695906 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.695926 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.695952 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.695971 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:34Z","lastTransitionTime":"2025-12-02T19:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.799531 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.799584 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.799601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.799628 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.799642 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:34Z","lastTransitionTime":"2025-12-02T19:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.903765 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.903883 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.903902 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.903932 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.903956 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:34Z","lastTransitionTime":"2025-12-02T19:58:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.983799 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:34 crc kubenswrapper[4807]: E1202 19:58:34.984328 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:34 crc kubenswrapper[4807]: E1202 19:58:34.984529 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs podName:1cb49a08-30b0-4353-ad4a-23362f281475 nodeName:}" failed. No retries permitted until 2025-12-02 19:58:50.984468218 +0000 UTC m=+66.285375703 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs") pod "network-metrics-daemon-7z9t6" (UID: "1cb49a08-30b0-4353-ad4a-23362f281475") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:34 crc kubenswrapper[4807]: I1202 19:58:34.995107 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:34Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.007271 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.007345 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.007365 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.007397 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.007416 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:35Z","lastTransitionTime":"2025-12-02T19:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.014579 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.027493 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.045034 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.069148 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.096433 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.110702 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.110765 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.110778 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.110820 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.110833 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:35Z","lastTransitionTime":"2025-12-02T19:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.120700 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.135465 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.147505 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.163454 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc 
kubenswrapper[4807]: I1202 19:58:35.179674 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.195894 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.213913 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.213993 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.214018 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.214046 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.213856 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e75
76da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.214066 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:35Z","lastTransitionTime":"2025-12-02T19:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.231871 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.256867 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:16Z\\\",\\\"message\\\":\\\"51 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1202 19:58:16.083988 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 
19:58:16.084011 6251 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1202 19:58:16.084019 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z]\\\\nI1202 19:58:16.084025 6251 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-diagnostics/netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/ru
n/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.271429 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.316307 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.316363 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.316375 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:35 crc 
kubenswrapper[4807]: I1202 19:58:35.316402 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.316416 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:35Z","lastTransitionTime":"2025-12-02T19:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.405685 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/2.log" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.406474 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/1.log" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.409938 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2" exitCode=1 Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.409986 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2"} Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.410094 4807 scope.go:117] "RemoveContainer" containerID="45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.410949 4807 scope.go:117] "RemoveContainer" 
containerID="7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2" Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.411150 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.419075 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.419122 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.419132 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.419152 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.419164 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:35Z","lastTransitionTime":"2025-12-02T19:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.423967 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.437930 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.455816 4807 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.472276 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.484904 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.498388 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a82
8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.511754 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2
e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.
126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.522184 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.522262 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.522277 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.522297 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.522312 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:35Z","lastTransitionTime":"2025-12-02T19:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.525481 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc 
kubenswrapper[4807]: I1202 19:58:35.544619 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.560469 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.576414 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e75
76da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.591466 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.610530 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:16Z\\\",\\\"message\\\":\\\"51 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1202 19:58:16.083988 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 
19:58:16.084011 6251 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1202 19:58:16.084019 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z]\\\\nI1202 19:58:16.084025 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:34Z\\\",\\\"message\\\":\\\", AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 19:58:33.879388 6461 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start 
defa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab184
9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.624971 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.625452 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.625503 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.625519 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:35 crc 
kubenswrapper[4807]: I1202 19:58:35.625545 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.625561 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:35Z","lastTransitionTime":"2025-12-02T19:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.638623 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.656224 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:35Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.728933 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.728975 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.728985 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.729001 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.729012 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:35Z","lastTransitionTime":"2025-12-02T19:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.832027 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.832076 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.832088 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.832106 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.832120 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:35Z","lastTransitionTime":"2025-12-02T19:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.894165 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.894330 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.894443 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 19:59:07.894393261 +0000 UTC m=+83.195300756 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.894494 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.894612 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.894653 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:59:07.894634117 +0000 UTC m=+83.195541612 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.894680 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.894774 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 19:59:07.89475951 +0000 UTC m=+83.195667015 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.894799 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.895034 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 
19:58:35.895084 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.895098 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.895169 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 19:59:07.89514695 +0000 UTC m=+83.196054445 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.935207 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.935261 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.935280 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.935299 4807 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.935311 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:35Z","lastTransitionTime":"2025-12-02T19:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.971635 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.971688 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.971858 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.971903 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.971904 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.972252 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.972526 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.972611 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:35 crc kubenswrapper[4807]: I1202 19:58:35.995825 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.996175 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.996236 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.996252 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:35 crc kubenswrapper[4807]: E1202 19:58:35.996338 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 19:59:07.996316299 +0000 UTC m=+83.297223974 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.037705 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.037816 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.037834 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.037864 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.037884 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:36Z","lastTransitionTime":"2025-12-02T19:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.140902 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.140957 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.140973 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.141016 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.141035 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:36Z","lastTransitionTime":"2025-12-02T19:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.244733 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.244769 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.244778 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.244792 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.244802 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:36Z","lastTransitionTime":"2025-12-02T19:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.347410 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.347463 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.347478 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.347494 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.347506 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:36Z","lastTransitionTime":"2025-12-02T19:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.416531 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/2.log" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.450224 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.450294 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.450313 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.450331 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.450343 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:36Z","lastTransitionTime":"2025-12-02T19:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.553750 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.553818 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.553837 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.553865 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.553888 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:36Z","lastTransitionTime":"2025-12-02T19:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.629071 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.641563 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.650687 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTi
me\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.656760 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.656809 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.656819 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.656832 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.656842 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:36Z","lastTransitionTime":"2025-12-02T19:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.662794 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.676498 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.688764 4807 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.701826 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.715685 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.728263 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987
b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"1
92.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.739919 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc 
kubenswrapper[4807]: I1202 19:58:36.755925 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.760144 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.760189 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.760198 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.760213 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.760226 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:36Z","lastTransitionTime":"2025-12-02T19:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.768956 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.785256 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e75
76da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.805289 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.827074 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:16Z\\\",\\\"message\\\":\\\"51 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1202 19:58:16.083988 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 
19:58:16.084011 6251 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1202 19:58:16.084019 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z]\\\\nI1202 19:58:16.084025 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:34Z\\\",\\\"message\\\":\\\", AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 19:58:33.879388 6461 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start 
defa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab184
9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.840515 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.855771 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.862510 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.862547 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.862556 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.862574 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.862585 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:36Z","lastTransitionTime":"2025-12-02T19:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.871596 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:36Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.965675 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.965743 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.965757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.965773 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:36 crc kubenswrapper[4807]: I1202 19:58:36.965783 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:36Z","lastTransitionTime":"2025-12-02T19:58:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.069274 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.069352 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.069376 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.069414 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.069440 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:37Z","lastTransitionTime":"2025-12-02T19:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.173760 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.173839 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.173857 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.173883 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.173900 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:37Z","lastTransitionTime":"2025-12-02T19:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.278041 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.278125 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.278149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.278183 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.278210 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:37Z","lastTransitionTime":"2025-12-02T19:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.381394 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.381447 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.381456 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.381472 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.381484 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:37Z","lastTransitionTime":"2025-12-02T19:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.485521 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.485585 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.485607 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.485638 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.485658 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:37Z","lastTransitionTime":"2025-12-02T19:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.588342 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.588417 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.588441 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.588471 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.588489 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:37Z","lastTransitionTime":"2025-12-02T19:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.692313 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.692384 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.692397 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.692418 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.692435 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:37Z","lastTransitionTime":"2025-12-02T19:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.795667 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.795759 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.795778 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.795805 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.795822 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:37Z","lastTransitionTime":"2025-12-02T19:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.898963 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.899015 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.899033 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.899057 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.899073 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:37Z","lastTransitionTime":"2025-12-02T19:58:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.972430 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.972536 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.972546 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:37 crc kubenswrapper[4807]: I1202 19:58:37.972575 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:37 crc kubenswrapper[4807]: E1202 19:58:37.972820 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:37 crc kubenswrapper[4807]: E1202 19:58:37.972918 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:37 crc kubenswrapper[4807]: E1202 19:58:37.973043 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:37 crc kubenswrapper[4807]: E1202 19:58:37.973205 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.002163 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.002233 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.002252 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.002278 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.002298 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.106348 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.106433 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.106461 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.106494 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.106519 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.208800 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.208884 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.208901 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.208923 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.208938 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.312063 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.312120 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.312133 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.312154 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.312170 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.415441 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.415518 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.415538 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.415560 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.415574 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.517935 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.517992 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.518005 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.518030 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.518044 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.620908 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.620968 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.620985 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.621009 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.621025 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.724055 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.724149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.724174 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.724278 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.724306 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.828534 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.828623 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.828643 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.828670 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.828691 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.849763 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.849820 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.849834 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.849854 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.849873 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: E1202 19:58:38.866130 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:38Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.871426 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.871485 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.871509 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.871536 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.871555 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: E1202 19:58:38.891999 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status […] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:38Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.897237 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.897358 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.897386 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.897421 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.897445 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: E1202 19:58:38.919294 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status […] for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:38Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.923171 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.923222 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.923240 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.923264 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.923282 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: E1202 19:58:38.938293 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:38Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.943633 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.943701 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.943756 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.943792 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.943816 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:38 crc kubenswrapper[4807]: E1202 19:58:38.960061 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:38Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:38 crc kubenswrapper[4807]: E1202 19:58:38.960174 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.961836 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.961881 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.961897 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.961919 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:38 crc kubenswrapper[4807]: I1202 19:58:38.961936 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:38Z","lastTransitionTime":"2025-12-02T19:58:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.066044 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.066111 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.066127 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.066152 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.066168 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:39Z","lastTransitionTime":"2025-12-02T19:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.169590 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.169637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.169647 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.169666 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.169676 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:39Z","lastTransitionTime":"2025-12-02T19:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.272867 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.272948 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.272962 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.272985 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.273005 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:39Z","lastTransitionTime":"2025-12-02T19:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.376676 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.376767 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.376781 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.376806 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.376824 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:39Z","lastTransitionTime":"2025-12-02T19:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.480163 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.480240 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.480256 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.480282 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.480295 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:39Z","lastTransitionTime":"2025-12-02T19:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.583577 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.583626 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.583639 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.583662 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.583677 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:39Z","lastTransitionTime":"2025-12-02T19:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.686622 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.686678 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.686695 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.686747 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.686770 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:39Z","lastTransitionTime":"2025-12-02T19:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.789790 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.789842 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.789856 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.789875 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.789887 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:39Z","lastTransitionTime":"2025-12-02T19:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.893519 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.893940 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.893952 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.893969 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.893980 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:39Z","lastTransitionTime":"2025-12-02T19:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.971661 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.971697 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.971848 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:39 crc kubenswrapper[4807]: E1202 19:58:39.972008 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.972028 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:39 crc kubenswrapper[4807]: E1202 19:58:39.972246 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:39 crc kubenswrapper[4807]: E1202 19:58:39.972305 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:39 crc kubenswrapper[4807]: E1202 19:58:39.972607 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.997060 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.997119 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.997138 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.997164 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:39 crc kubenswrapper[4807]: I1202 19:58:39.997185 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:39Z","lastTransitionTime":"2025-12-02T19:58:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.103969 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.104009 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.104016 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.104031 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.104041 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:40Z","lastTransitionTime":"2025-12-02T19:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.207553 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.207618 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.207635 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.207659 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.207684 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:40Z","lastTransitionTime":"2025-12-02T19:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.310891 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.310968 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.310986 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.311013 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.311032 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:40Z","lastTransitionTime":"2025-12-02T19:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.415029 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.415107 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.415129 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.415157 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.415175 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:40Z","lastTransitionTime":"2025-12-02T19:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.518219 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.518269 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.518281 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.518299 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.518313 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:40Z","lastTransitionTime":"2025-12-02T19:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.621400 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.621480 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.621498 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.621525 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.621543 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:40Z","lastTransitionTime":"2025-12-02T19:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.725038 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.725090 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.725100 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.725121 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.725133 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:40Z","lastTransitionTime":"2025-12-02T19:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.827510 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.827589 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.827615 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.827646 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.827665 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:40Z","lastTransitionTime":"2025-12-02T19:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.931838 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.931908 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.931919 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.931941 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:40 crc kubenswrapper[4807]: I1202 19:58:40.931955 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:40Z","lastTransitionTime":"2025-12-02T19:58:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.035748 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.035812 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.035845 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.035864 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.035876 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:41Z","lastTransitionTime":"2025-12-02T19:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.138499 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.138561 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.138573 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.138587 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.138597 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:41Z","lastTransitionTime":"2025-12-02T19:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.241062 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.241136 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.241147 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.241170 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.241190 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:41Z","lastTransitionTime":"2025-12-02T19:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.345558 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.345615 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.345623 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.345642 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.345653 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:41Z","lastTransitionTime":"2025-12-02T19:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.448653 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.448749 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.448764 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.448785 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.448803 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:41Z","lastTransitionTime":"2025-12-02T19:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.552463 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.552754 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.552773 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.552793 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.552809 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:41Z","lastTransitionTime":"2025-12-02T19:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.655404 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.655540 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.655566 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.655598 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.655618 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:41Z","lastTransitionTime":"2025-12-02T19:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.758874 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.758920 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.758931 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.758955 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.758968 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:41Z","lastTransitionTime":"2025-12-02T19:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.861621 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.861676 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.861687 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.861710 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.861741 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:41Z","lastTransitionTime":"2025-12-02T19:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.964576 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.964641 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.964675 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.964696 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.964708 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:41Z","lastTransitionTime":"2025-12-02T19:58:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.972321 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.972397 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.972353 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:41 crc kubenswrapper[4807]: I1202 19:58:41.972353 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:41 crc kubenswrapper[4807]: E1202 19:58:41.972623 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:41 crc kubenswrapper[4807]: E1202 19:58:41.972863 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:41 crc kubenswrapper[4807]: E1202 19:58:41.973011 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:41 crc kubenswrapper[4807]: E1202 19:58:41.973122 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.067871 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.067938 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.067951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.067975 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.067995 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:42Z","lastTransitionTime":"2025-12-02T19:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.171527 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.171584 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.171601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.171628 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.171672 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:42Z","lastTransitionTime":"2025-12-02T19:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.274769 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.274844 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.274864 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.274896 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.274918 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:42Z","lastTransitionTime":"2025-12-02T19:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.378402 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.378466 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.378483 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.378505 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.378523 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:42Z","lastTransitionTime":"2025-12-02T19:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.482342 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.482408 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.482421 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.482444 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.482461 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:42Z","lastTransitionTime":"2025-12-02T19:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.584770 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.584821 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.584835 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.584931 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.584946 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:42Z","lastTransitionTime":"2025-12-02T19:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.688045 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.688121 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.688143 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.688171 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.688191 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:42Z","lastTransitionTime":"2025-12-02T19:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.791279 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.791365 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.791388 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.791429 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.791453 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:42Z","lastTransitionTime":"2025-12-02T19:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.898046 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.898137 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.898171 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.898191 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:42 crc kubenswrapper[4807]: I1202 19:58:42.898205 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:42Z","lastTransitionTime":"2025-12-02T19:58:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.001958 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.002019 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.002039 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.002061 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.002083 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:43Z","lastTransitionTime":"2025-12-02T19:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.106155 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.106205 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.106224 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.106245 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.106259 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:43Z","lastTransitionTime":"2025-12-02T19:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.217047 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.217089 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.217141 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.217164 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.217175 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:43Z","lastTransitionTime":"2025-12-02T19:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.330284 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.330363 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.330380 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.330399 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.330411 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:43Z","lastTransitionTime":"2025-12-02T19:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.432640 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.432708 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.432770 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.432821 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.432848 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:43Z","lastTransitionTime":"2025-12-02T19:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.536523 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.536563 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.536574 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.536592 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.536606 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:43Z","lastTransitionTime":"2025-12-02T19:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.640067 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.640120 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.640136 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.640159 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.640175 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:43Z","lastTransitionTime":"2025-12-02T19:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.743354 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.743428 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.743444 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.743468 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.743487 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:43Z","lastTransitionTime":"2025-12-02T19:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.846710 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.846773 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.846786 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.846804 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.846814 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:43Z","lastTransitionTime":"2025-12-02T19:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.949922 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.949997 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.950023 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.950055 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.950077 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:43Z","lastTransitionTime":"2025-12-02T19:58:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.972143 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.972192 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.972206 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:43 crc kubenswrapper[4807]: E1202 19:58:43.972315 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:43 crc kubenswrapper[4807]: I1202 19:58:43.972353 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:43 crc kubenswrapper[4807]: E1202 19:58:43.972545 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:43 crc kubenswrapper[4807]: E1202 19:58:43.972634 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:43 crc kubenswrapper[4807]: E1202 19:58:43.972710 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.053926 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.053986 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.054004 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.054033 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.054056 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:44Z","lastTransitionTime":"2025-12-02T19:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.157382 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.157452 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.157475 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.157506 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.157529 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:44Z","lastTransitionTime":"2025-12-02T19:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.261154 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.261192 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.261203 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.261219 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.261229 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:44Z","lastTransitionTime":"2025-12-02T19:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.364257 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.364305 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.364316 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.364334 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.364349 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:44Z","lastTransitionTime":"2025-12-02T19:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.468644 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.468784 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.468804 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.468857 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.468941 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:44Z","lastTransitionTime":"2025-12-02T19:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.571797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.571878 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.571932 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.571953 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.571966 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:44Z","lastTransitionTime":"2025-12-02T19:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.675046 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.675107 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.675119 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.675139 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.675152 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:44Z","lastTransitionTime":"2025-12-02T19:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.778182 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.778248 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.778262 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.778277 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.778290 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:44Z","lastTransitionTime":"2025-12-02T19:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.881478 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.881582 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.881599 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.881622 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.881644 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:44Z","lastTransitionTime":"2025-12-02T19:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.983932 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.983990 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.984004 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.984023 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.984059 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:44Z","lastTransitionTime":"2025-12-02T19:58:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:44 crc kubenswrapper[4807]: I1202 19:58:44.992371 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:44Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.016529 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.033644 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.052970 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.069832 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.083923 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.086499 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.086556 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.086569 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.086588 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.086603 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:45Z","lastTransitionTime":"2025-12-02T19:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.105981 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z 
is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.121483 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.134806 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d70
61be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.151033 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc 
kubenswrapper[4807]: I1202 19:58:45.175849 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.189676 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.189975 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.190037 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.190120 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.190215 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:45Z","lastTransitionTime":"2025-12-02T19:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.195246 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.213013 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a51f0b9b-8269-4d6f-b5ec-55870461ebe1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://287123147f1562edac9044a1b110e49241e677f8df9ac9987c0865d0d202f9de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8def930722eb9a717d4c8945163ad1313a7912ac511b4a4c66b942a012afb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6993537ec825ca91163121a75d7129394b78f55140f7e4f96339a6609486f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.234516 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e75
76da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.253113 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.274947 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45520c95c5790f89947aa9678d1b50884f4c7dd17ecc1a2d0051b5062e9f7700\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:16Z\\\",\\\"message\\\":\\\"51 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1202 19:58:16.083988 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1202 
19:58:16.084011 6251 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1202 19:58:16.084019 6251 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:16Z is after 2025-08-24T17:21:41Z]\\\\nI1202 19:58:16.084025 6251 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/netwo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:34Z\\\",\\\"message\\\":\\\", AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 19:58:33.879388 6461 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start 
defa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab184
9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.292814 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:45Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.293643 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.293707 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.293761 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:45 crc 
kubenswrapper[4807]: I1202 19:58:45.293793 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.293819 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:45Z","lastTransitionTime":"2025-12-02T19:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.399943 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.400019 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.400041 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.400071 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.400089 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:45Z","lastTransitionTime":"2025-12-02T19:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.503921 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.504637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.504745 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.504790 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.504818 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:45Z","lastTransitionTime":"2025-12-02T19:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.608385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.608461 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.608490 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.608522 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.608545 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:45Z","lastTransitionTime":"2025-12-02T19:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.711835 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.711927 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.711945 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.711970 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.711991 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:45Z","lastTransitionTime":"2025-12-02T19:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.815006 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.815057 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.815069 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.815085 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.815098 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:45Z","lastTransitionTime":"2025-12-02T19:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.917097 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.917135 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.917146 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.917161 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.917172 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:45Z","lastTransitionTime":"2025-12-02T19:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.971591 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.971705 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:45 crc kubenswrapper[4807]: E1202 19:58:45.971784 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.971605 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:45 crc kubenswrapper[4807]: E1202 19:58:45.971912 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:45 crc kubenswrapper[4807]: I1202 19:58:45.971624 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:45 crc kubenswrapper[4807]: E1202 19:58:45.972020 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:45 crc kubenswrapper[4807]: E1202 19:58:45.972099 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.019516 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.019556 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.019568 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.019583 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.019593 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:46Z","lastTransitionTime":"2025-12-02T19:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.122173 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.122219 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.122229 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.122247 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.122257 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:46Z","lastTransitionTime":"2025-12-02T19:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.225516 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.225579 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.225597 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.225649 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.225665 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:46Z","lastTransitionTime":"2025-12-02T19:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.328050 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.328117 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.328128 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.328152 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.328164 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:46Z","lastTransitionTime":"2025-12-02T19:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.431928 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.431995 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.432011 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.432039 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.432060 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:46Z","lastTransitionTime":"2025-12-02T19:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.534482 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.534535 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.534545 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.534559 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.534579 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:46Z","lastTransitionTime":"2025-12-02T19:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.636817 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.636872 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.636891 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.636915 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.636930 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:46Z","lastTransitionTime":"2025-12-02T19:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.739455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.739508 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.739523 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.739545 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.739563 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:46Z","lastTransitionTime":"2025-12-02T19:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.841788 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.841846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.841866 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.841888 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.841905 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:46Z","lastTransitionTime":"2025-12-02T19:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.944212 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.944246 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.944257 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.944272 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:46 crc kubenswrapper[4807]: I1202 19:58:46.944284 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:46Z","lastTransitionTime":"2025-12-02T19:58:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.047186 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.047258 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.047274 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.047290 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.047301 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:47Z","lastTransitionTime":"2025-12-02T19:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.150423 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.150458 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.150468 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.150480 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.150488 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:47Z","lastTransitionTime":"2025-12-02T19:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.253953 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.254042 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.254063 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.254091 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.254110 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:47Z","lastTransitionTime":"2025-12-02T19:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.356802 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.356858 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.356870 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.356889 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.356902 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:47Z","lastTransitionTime":"2025-12-02T19:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.459852 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.459916 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.459929 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.459949 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.459983 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:47Z","lastTransitionTime":"2025-12-02T19:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.562555 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.562595 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.562604 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.562617 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.562626 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:47Z","lastTransitionTime":"2025-12-02T19:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.665135 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.665169 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.665177 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.665191 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.665200 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:47Z","lastTransitionTime":"2025-12-02T19:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.768567 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.768645 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.768664 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.768692 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.768742 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:47Z","lastTransitionTime":"2025-12-02T19:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.872522 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.872585 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.872601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.872624 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.872639 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:47Z","lastTransitionTime":"2025-12-02T19:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.971903 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.971952 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.971998 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.971952 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:47 crc kubenswrapper[4807]: E1202 19:58:47.972109 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:47 crc kubenswrapper[4807]: E1202 19:58:47.972195 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:47 crc kubenswrapper[4807]: E1202 19:58:47.972296 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:47 crc kubenswrapper[4807]: E1202 19:58:47.972365 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.975165 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.975199 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.975211 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.975229 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:47 crc kubenswrapper[4807]: I1202 19:58:47.975244 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:47Z","lastTransitionTime":"2025-12-02T19:58:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.078250 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.078318 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.078334 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.078354 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.078370 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:48Z","lastTransitionTime":"2025-12-02T19:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.181638 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.181705 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.181738 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.181761 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.181779 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:48Z","lastTransitionTime":"2025-12-02T19:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.284387 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.284435 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.284446 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.284470 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.284481 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:48Z","lastTransitionTime":"2025-12-02T19:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.387598 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.387672 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.387696 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.387811 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.387835 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:48Z","lastTransitionTime":"2025-12-02T19:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.490255 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.490313 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.490330 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.490355 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.490374 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:48Z","lastTransitionTime":"2025-12-02T19:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.593061 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.593097 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.593105 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.593119 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.593147 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:48Z","lastTransitionTime":"2025-12-02T19:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.695419 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.695455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.695465 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.695479 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.695489 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:48Z","lastTransitionTime":"2025-12-02T19:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.798007 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.798044 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.798052 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.798066 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.798075 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:48Z","lastTransitionTime":"2025-12-02T19:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.901023 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.901067 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.901080 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.901096 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.901109 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:48Z","lastTransitionTime":"2025-12-02T19:58:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.972675 4807 scope.go:117] "RemoveContainer" containerID="7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2" Dec 02 19:58:48 crc kubenswrapper[4807]: E1202 19:58:48.972994 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" Dec 02 19:58:48 crc kubenswrapper[4807]: I1202 19:58:48.990686 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:48Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.003344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.003394 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.003405 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc 
kubenswrapper[4807]: I1202 19:58:49.003424 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.003439 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.007772 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: E1202 19:58:49.016238 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.021055 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.021125 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.021142 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.021202 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.021216 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.021962 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: E1202 19:58:49.032640 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.034844 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.037940 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.037987 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc 
kubenswrapper[4807]: I1202 19:58:49.037997 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.038016 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.038029 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: E1202 19:58:49.053973 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.057795 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:34Z\\\",\\\"message\\\":\\\", AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 19:58:33.879388 6461 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start defa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e9
8f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.059656 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.059696 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.059708 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.059745 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.059757 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: E1202 19:58:49.075921 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.077546 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14
591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime
\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.080054 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.080091 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.080102 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.080119 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.080130 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.094187 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a51f0b9b-8269-4d6f-b5ec-55870461ebe1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://287123147f1562edac9044a1b110e49241e677f8df9ac9987c0865d0d202f9de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8def930722eb9a717d4c8945163a
d1313a7912ac511b4a4c66b942a012afb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6993537ec825ca91163121a75d7129394b78f55140f7e4f96339a6609486f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: E1202 19:58:49.097221 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: E1202 19:58:49.097388 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.099806 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.099859 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.099875 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.099893 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.099905 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.112929 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.125957 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.138789 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.153188 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.164744 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.180115 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a82
8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.194851 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c3
7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.203218 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.203252 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.203262 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 
19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.203280 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.203293 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.211940 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.226350 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc 
kubenswrapper[4807]: I1202 19:58:49.240191 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:49Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.306682 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.306779 4807 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.306797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.306826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.306846 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.410222 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.410268 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.410279 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.410295 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.410307 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.513111 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.513200 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.513212 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.513238 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.513252 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.615771 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.615827 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.615845 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.615867 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.615889 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.719444 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.719510 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.719531 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.719554 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.719571 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.821996 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.822034 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.822045 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.822060 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.822069 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.925487 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.925551 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.925564 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.925582 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.925596 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:49Z","lastTransitionTime":"2025-12-02T19:58:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.971538 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.971622 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:49 crc kubenswrapper[4807]: E1202 19:58:49.971662 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.971737 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:49 crc kubenswrapper[4807]: I1202 19:58:49.971775 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:49 crc kubenswrapper[4807]: E1202 19:58:49.971840 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:49 crc kubenswrapper[4807]: E1202 19:58:49.971880 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:49 crc kubenswrapper[4807]: E1202 19:58:49.971948 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.028957 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.029033 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.029046 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.029090 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.029102 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:50Z","lastTransitionTime":"2025-12-02T19:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.131392 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.131512 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.131536 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.131563 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.131609 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:50Z","lastTransitionTime":"2025-12-02T19:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.234173 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.234225 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.234242 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.234260 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.234271 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:50Z","lastTransitionTime":"2025-12-02T19:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.337314 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.337375 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.337387 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.337407 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.337419 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:50Z","lastTransitionTime":"2025-12-02T19:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.440418 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.440465 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.440477 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.440495 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.440511 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:50Z","lastTransitionTime":"2025-12-02T19:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.543264 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.543313 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.543323 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.543342 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.543358 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:50Z","lastTransitionTime":"2025-12-02T19:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.645246 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.645282 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.645292 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.645306 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.645318 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:50Z","lastTransitionTime":"2025-12-02T19:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.747887 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.747952 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.747963 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.747979 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.747991 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:50Z","lastTransitionTime":"2025-12-02T19:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.851426 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.851490 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.851504 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.851521 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.851532 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:50Z","lastTransitionTime":"2025-12-02T19:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.954028 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.954082 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.954095 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.954114 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:50 crc kubenswrapper[4807]: I1202 19:58:50.954128 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:50Z","lastTransitionTime":"2025-12-02T19:58:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.056140 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.056180 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.056191 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.056205 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.056216 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:51Z","lastTransitionTime":"2025-12-02T19:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.058438 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:51 crc kubenswrapper[4807]: E1202 19:58:51.058548 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:51 crc kubenswrapper[4807]: E1202 19:58:51.058593 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs podName:1cb49a08-30b0-4353-ad4a-23362f281475 nodeName:}" failed. No retries permitted until 2025-12-02 19:59:23.058580785 +0000 UTC m=+98.359488280 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs") pod "network-metrics-daemon-7z9t6" (UID: "1cb49a08-30b0-4353-ad4a-23362f281475") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.159074 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.159111 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.159123 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.159138 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.159149 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:51Z","lastTransitionTime":"2025-12-02T19:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.262465 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.262548 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.262562 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.262579 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.262590 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:51Z","lastTransitionTime":"2025-12-02T19:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.365143 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.365211 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.365223 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.365269 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.365283 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:51Z","lastTransitionTime":"2025-12-02T19:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.467884 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.467964 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.467975 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.467992 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.468003 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:51Z","lastTransitionTime":"2025-12-02T19:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.571263 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.571306 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.571317 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.571335 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.571347 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:51Z","lastTransitionTime":"2025-12-02T19:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.673922 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.673970 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.673982 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.673997 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.674007 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:51Z","lastTransitionTime":"2025-12-02T19:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.776357 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.776399 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.776410 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.776427 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.776442 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:51Z","lastTransitionTime":"2025-12-02T19:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.879033 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.879086 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.879101 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.879119 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.879134 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:51Z","lastTransitionTime":"2025-12-02T19:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.972321 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.972325 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.972322 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:51 crc kubenswrapper[4807]: E1202 19:58:51.973019 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:51 crc kubenswrapper[4807]: E1202 19:58:51.972827 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.972498 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:51 crc kubenswrapper[4807]: E1202 19:58:51.973227 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:51 crc kubenswrapper[4807]: E1202 19:58:51.973253 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.981420 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.981476 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.981487 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.981503 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:51 crc kubenswrapper[4807]: I1202 19:58:51.981512 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:51Z","lastTransitionTime":"2025-12-02T19:58:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.084006 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.084042 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.084052 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.084066 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.084078 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:52Z","lastTransitionTime":"2025-12-02T19:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.186797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.186839 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.186847 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.186863 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.186873 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:52Z","lastTransitionTime":"2025-12-02T19:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.289657 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.289707 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.289738 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.289759 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.289774 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:52Z","lastTransitionTime":"2025-12-02T19:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.392411 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.392482 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.392497 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.392516 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.392530 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:52Z","lastTransitionTime":"2025-12-02T19:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.478323 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f5x8r_8a909a25-5ede-458e-af78-4a41b79716a5/kube-multus/0.log" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.478403 4807 generic.go:334] "Generic (PLEG): container finished" podID="8a909a25-5ede-458e-af78-4a41b79716a5" containerID="2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828" exitCode=1 Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.478448 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f5x8r" event={"ID":"8a909a25-5ede-458e-af78-4a41b79716a5","Type":"ContainerDied","Data":"2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828"} Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.479052 4807 scope.go:117] "RemoveContainer" containerID="2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.494894 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.498303 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.498365 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.498378 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 
19:58:52.498401 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.498418 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:52Z","lastTransitionTime":"2025-12-02T19:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.509902 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.521533 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.534831 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:52Z\\\",\\\"message\\\":\\\"2025-12-02T19:58:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72\\\\n2025-12-02T19:58:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72 to /host/opt/cni/bin/\\\\n2025-12-02T19:58:06Z [verbose] multus-daemon started\\\\n2025-12-02T19:58:06Z [verbose] Readiness Indicator file check\\\\n2025-12-02T19:58:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.546070 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.560631 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.570306 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.582783 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.597229 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.600682 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.600711 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.600745 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.600761 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.600773 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:52Z","lastTransitionTime":"2025-12-02T19:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.610686 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.629742 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.641242 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.675167 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:34Z\\\",\\\"message\\\":\\\", AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 19:58:33.879388 6461 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start defa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e9
8f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.703279 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.703338 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.703351 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.703370 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.703408 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:52Z","lastTransitionTime":"2025-12-02T19:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.704197 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.725054 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a51f0b9b-8269-4d6f-b5ec-55870461ebe1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://287123147f1562edac9044a1b110e49241e677f8df9ac9987c0865d0d202f9de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8def9307
22eb9a717d4c8945163ad1313a7912ac511b4a4c66b942a012afb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6993537ec825ca91163121a75d7129394b78f55140f7e4f96339a6609486f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.742615 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.754831 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:52Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.806632 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.806682 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.806691 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.806705 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.806741 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:52Z","lastTransitionTime":"2025-12-02T19:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.909422 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.909452 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.909460 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.909476 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:52 crc kubenswrapper[4807]: I1202 19:58:52.909488 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:52Z","lastTransitionTime":"2025-12-02T19:58:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.011520 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.011562 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.011575 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.011593 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.011604 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:53Z","lastTransitionTime":"2025-12-02T19:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.114247 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.114300 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.114312 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.114327 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.114338 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:53Z","lastTransitionTime":"2025-12-02T19:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.217036 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.217083 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.217094 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.217110 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.217121 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:53Z","lastTransitionTime":"2025-12-02T19:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.319887 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.319944 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.319962 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.319984 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.320001 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:53Z","lastTransitionTime":"2025-12-02T19:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.423297 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.423345 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.423356 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.423372 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.423384 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:53Z","lastTransitionTime":"2025-12-02T19:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.483000 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f5x8r_8a909a25-5ede-458e-af78-4a41b79716a5/kube-multus/0.log" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.483062 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f5x8r" event={"ID":"8a909a25-5ede-458e-af78-4a41b79716a5","Type":"ContainerStarted","Data":"c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d63"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.503461 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a51f0b9b-8269-4d6f-b5ec-55870461ebe1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://287123147f1562edac9044a1b110e49241e677f8df9ac9987c0865d0d202f9de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8def930722eb9a717d4c8945163ad1313a7912ac511b4a4c66b942a012afb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6993537ec825ca91163121a75d7129394b78f55140f7e4f96339a6609486f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.516409 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e75
76da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.525642 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.525705 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.525742 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.525767 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.525778 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:53Z","lastTransitionTime":"2025-12-02T19:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.530933 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.549213 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:34Z\\\",\\\"message\\\":\\\", AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 19:58:33.879388 6461 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start defa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e9
8f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.559553 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.574841 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.589252 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.601337 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.616600 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:52Z\\\",\\\"message\\\":\\\"2025-12-02T19:58:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72\\\\n2025-12-02T19:58:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72 to /host/opt/cni/bin/\\\\n2025-12-02T19:58:06Z [verbose] multus-daemon started\\\\n2025-12-02T19:58:06Z [verbose] 
Readiness Indicator file check\\\\n2025-12-02T19:58:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.626768 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ff
a4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.627897 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.627939 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.627948 4807 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.627962 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.627972 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:53Z","lastTransitionTime":"2025-12-02T19:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.640296 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.653959 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.666504 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.678245 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.694086 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc 
kubenswrapper[4807]: I1202 19:58:53.713806 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.730158 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:53Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.730695 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.730762 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.730777 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.730797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.730811 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:53Z","lastTransitionTime":"2025-12-02T19:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.833635 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.833698 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.833735 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.833764 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.833788 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:53Z","lastTransitionTime":"2025-12-02T19:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.936767 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.936818 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.936828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.936843 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.936854 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:53Z","lastTransitionTime":"2025-12-02T19:58:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.971534 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.971606 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.971561 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:53 crc kubenswrapper[4807]: I1202 19:58:53.971787 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:53 crc kubenswrapper[4807]: E1202 19:58:53.972020 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:53 crc kubenswrapper[4807]: E1202 19:58:53.972080 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:53 crc kubenswrapper[4807]: E1202 19:58:53.971937 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:53 crc kubenswrapper[4807]: E1202 19:58:53.972243 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.040061 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.040126 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.040136 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.040155 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.040168 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:54Z","lastTransitionTime":"2025-12-02T19:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.142666 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.142737 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.142750 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.142766 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.142775 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:54Z","lastTransitionTime":"2025-12-02T19:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.245788 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.246063 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.246125 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.246243 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.246303 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:54Z","lastTransitionTime":"2025-12-02T19:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.349166 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.349443 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.349574 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.349675 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.349809 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:54Z","lastTransitionTime":"2025-12-02T19:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.452288 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.452329 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.452337 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.452352 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.452362 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:54Z","lastTransitionTime":"2025-12-02T19:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.555279 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.555336 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.555352 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.555373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.555387 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:54Z","lastTransitionTime":"2025-12-02T19:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.658084 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.658130 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.658144 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.658162 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.658176 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:54Z","lastTransitionTime":"2025-12-02T19:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.761387 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.761442 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.761455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.761474 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.761488 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:54Z","lastTransitionTime":"2025-12-02T19:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.863600 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.863656 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.863673 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.863696 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.863737 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:54Z","lastTransitionTime":"2025-12-02T19:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.966689 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.966763 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.966780 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.966803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.966820 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:54Z","lastTransitionTime":"2025-12-02T19:58:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:54 crc kubenswrapper[4807]: I1202 19:58:54.986706 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:54Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.001226 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:54Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.012849 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.021690 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.032960 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:52Z\\\",\\\"message\\\":\\\"2025-12-02T19:58:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72\\\\n2025-12-02T19:58:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72 to /host/opt/cni/bin/\\\\n2025-12-02T19:58:06Z [verbose] multus-daemon started\\\\n2025-12-02T19:58:06Z [verbose] 
Readiness Indicator file check\\\\n2025-12-02T19:58:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.041961 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ff
a4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.053140 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.065222 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.068625 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.068651 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.068661 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 
19:58:55.068674 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.068684 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:55Z","lastTransitionTime":"2025-12-02T19:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.074983 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f
4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.085543 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc 
kubenswrapper[4807]: I1202 19:58:55.097887 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.111484 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.122053 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.132784 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a51f0b9b-8269-4d6f-b5ec-55870461ebe1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://287123147f1562edac9044a1b110e49241e677f8df9ac9987c0865d0d202f9de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8def930722eb9a717d4c8945163ad1313a7912ac511b4a4c66b942a012afb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6993537ec825ca91163121a75d7129394b78f55140f7e4f96339a6609486f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.144931 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e75
76da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.159227 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.171149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.171207 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.171224 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.171247 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.171265 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:55Z","lastTransitionTime":"2025-12-02T19:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.181623 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:34Z\\\",\\\"message\\\":\\\", AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 19:58:33.879388 6461 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start defa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e9
8f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:55Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.274286 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.274333 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.274347 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.274362 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.274373 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:55Z","lastTransitionTime":"2025-12-02T19:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.377089 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.377134 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.377146 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.377165 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.377179 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:55Z","lastTransitionTime":"2025-12-02T19:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.480021 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.480054 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.480064 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.480076 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.480084 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:55Z","lastTransitionTime":"2025-12-02T19:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.583228 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.583274 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.583283 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.583298 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.583309 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:55Z","lastTransitionTime":"2025-12-02T19:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.685651 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.685689 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.685739 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.685761 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.685773 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:55Z","lastTransitionTime":"2025-12-02T19:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.788079 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.788149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.788167 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.788194 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.788212 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:55Z","lastTransitionTime":"2025-12-02T19:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.891142 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.891192 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.891205 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.891225 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.891238 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:55Z","lastTransitionTime":"2025-12-02T19:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.971617 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.971802 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:55 crc kubenswrapper[4807]: E1202 19:58:55.971939 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.971979 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.972163 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:55 crc kubenswrapper[4807]: E1202 19:58:55.972154 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:55 crc kubenswrapper[4807]: E1202 19:58:55.972287 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:55 crc kubenswrapper[4807]: E1202 19:58:55.972508 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.998215 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.998257 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.998269 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.998287 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:55 crc kubenswrapper[4807]: I1202 19:58:55.998307 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:55Z","lastTransitionTime":"2025-12-02T19:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.101803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.101866 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.101880 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.101902 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.101915 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:56Z","lastTransitionTime":"2025-12-02T19:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.204600 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.204640 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.204653 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.204668 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.204678 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:56Z","lastTransitionTime":"2025-12-02T19:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.307163 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.307199 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.307209 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.307225 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.307236 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:56Z","lastTransitionTime":"2025-12-02T19:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.410105 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.410191 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.410208 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.410225 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.410237 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:56Z","lastTransitionTime":"2025-12-02T19:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.512712 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.512806 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.512824 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.512847 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.512865 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:56Z","lastTransitionTime":"2025-12-02T19:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.615621 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.615659 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.615669 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.615684 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.615695 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:56Z","lastTransitionTime":"2025-12-02T19:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.718601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.718644 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.718655 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.718672 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.718687 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:56Z","lastTransitionTime":"2025-12-02T19:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.821315 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.821369 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.821380 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.821395 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.821405 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:56Z","lastTransitionTime":"2025-12-02T19:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.925431 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.925471 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.925484 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.925500 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:56 crc kubenswrapper[4807]: I1202 19:58:56.925511 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:56Z","lastTransitionTime":"2025-12-02T19:58:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.028541 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.028602 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.028617 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.028638 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.028654 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:57Z","lastTransitionTime":"2025-12-02T19:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.131132 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.131174 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.131183 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.131198 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.131211 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:57Z","lastTransitionTime":"2025-12-02T19:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.234283 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.234335 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.234346 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.234360 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.234369 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:57Z","lastTransitionTime":"2025-12-02T19:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.337626 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.337692 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.337711 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.337780 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.337804 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:57Z","lastTransitionTime":"2025-12-02T19:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.440976 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.441019 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.441030 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.441045 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.441058 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:57Z","lastTransitionTime":"2025-12-02T19:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.543518 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.543574 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.543587 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.543606 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.543621 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:57Z","lastTransitionTime":"2025-12-02T19:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.646292 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.646341 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.646354 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.646372 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.646384 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:57Z","lastTransitionTime":"2025-12-02T19:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.748481 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.748537 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.748555 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.748593 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.748614 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:57Z","lastTransitionTime":"2025-12-02T19:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.851402 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.851441 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.851454 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.851471 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.851483 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:57Z","lastTransitionTime":"2025-12-02T19:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.954643 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.954714 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.954771 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.954801 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.954825 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:57Z","lastTransitionTime":"2025-12-02T19:58:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.971845 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.971985 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.972119 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:57 crc kubenswrapper[4807]: E1202 19:58:57.972132 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:57 crc kubenswrapper[4807]: E1202 19:58:57.972255 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:57 crc kubenswrapper[4807]: E1202 19:58:57.972354 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:57 crc kubenswrapper[4807]: I1202 19:58:57.972503 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:57 crc kubenswrapper[4807]: E1202 19:58:57.972826 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.057995 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.058047 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.058063 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.058082 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.058098 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:58Z","lastTransitionTime":"2025-12-02T19:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.160890 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.160933 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.160951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.160974 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.160993 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:58Z","lastTransitionTime":"2025-12-02T19:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.263429 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.263517 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.263548 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.263586 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.263610 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:58Z","lastTransitionTime":"2025-12-02T19:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.366345 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.366410 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.366430 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.366458 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.366475 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:58Z","lastTransitionTime":"2025-12-02T19:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.469377 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.469424 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.469435 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.469452 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.469463 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:58Z","lastTransitionTime":"2025-12-02T19:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.572599 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.572669 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.572683 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.572703 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.572736 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:58Z","lastTransitionTime":"2025-12-02T19:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.676229 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.676306 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.676327 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.676357 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.676385 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:58Z","lastTransitionTime":"2025-12-02T19:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.779562 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.779619 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.779633 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.779656 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.779697 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:58Z","lastTransitionTime":"2025-12-02T19:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.882984 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.883037 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.883050 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.883070 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.883082 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:58Z","lastTransitionTime":"2025-12-02T19:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.985880 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.985944 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.985957 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.985977 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:58 crc kubenswrapper[4807]: I1202 19:58:58.985992 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:58Z","lastTransitionTime":"2025-12-02T19:58:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.089571 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.089642 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.089655 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.089677 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.089691 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.193127 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.193172 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.193184 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.193202 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.193216 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.296607 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.296826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.296877 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.296907 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.296930 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.386625 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.386666 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.386697 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.386711 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.386761 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: E1202 19:58:59.402287 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:59Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.407205 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.407262 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.407279 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.407302 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.407319 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: E1202 19:58:59.421644 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:59Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.425755 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.425782 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.425792 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.425806 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.425815 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: E1202 19:58:59.439586 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:59Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.445981 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.446050 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.446066 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.446096 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.446112 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: E1202 19:58:59.464842 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:59Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.469658 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.469745 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.469760 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.469779 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.469795 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: E1202 19:58:59.487266 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:58:59Z is after 2025-08-24T17:21:41Z" Dec 02 19:58:59 crc kubenswrapper[4807]: E1202 19:58:59.487438 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.489441 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.489480 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.489492 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.489515 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.489529 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.593927 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.594389 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.594408 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.594467 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.594490 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.698266 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.698328 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.698348 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.698371 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.698390 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.801447 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.801506 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.801518 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.801540 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.801552 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.903846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.903905 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.903923 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.903945 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.903963 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:58:59Z","lastTransitionTime":"2025-12-02T19:58:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.972211 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.972293 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.972294 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:58:59 crc kubenswrapper[4807]: I1202 19:58:59.972294 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:58:59 crc kubenswrapper[4807]: E1202 19:58:59.972481 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:58:59 crc kubenswrapper[4807]: E1202 19:58:59.972626 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:58:59 crc kubenswrapper[4807]: E1202 19:58:59.972755 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:58:59 crc kubenswrapper[4807]: E1202 19:58:59.972824 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.008081 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.008209 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.008222 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.008243 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.008258 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:00Z","lastTransitionTime":"2025-12-02T19:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.111568 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.111628 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.111642 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.111664 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.111680 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:00Z","lastTransitionTime":"2025-12-02T19:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.215207 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.215287 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.215307 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.215335 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.215357 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:00Z","lastTransitionTime":"2025-12-02T19:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.322261 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.322299 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.322308 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.322322 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.322330 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:00Z","lastTransitionTime":"2025-12-02T19:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.425239 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.425358 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.425379 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.425406 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.425430 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:00Z","lastTransitionTime":"2025-12-02T19:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.528631 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.528699 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.528711 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.528753 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.528767 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:00Z","lastTransitionTime":"2025-12-02T19:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.632807 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.632871 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.632887 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.632911 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.632928 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:00Z","lastTransitionTime":"2025-12-02T19:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.736094 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.736191 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.736216 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.736245 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.736266 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:00Z","lastTransitionTime":"2025-12-02T19:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.840393 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.840480 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.840505 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.840543 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.840569 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:00Z","lastTransitionTime":"2025-12-02T19:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.943989 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.944051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.944065 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.944097 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:00 crc kubenswrapper[4807]: I1202 19:59:00.944114 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:00Z","lastTransitionTime":"2025-12-02T19:59:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.047803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.047881 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.047900 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.047925 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.047943 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:01Z","lastTransitionTime":"2025-12-02T19:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.150899 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.150944 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.150979 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.150996 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.151007 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:01Z","lastTransitionTime":"2025-12-02T19:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.254385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.254453 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.254470 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.254495 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.254517 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:01Z","lastTransitionTime":"2025-12-02T19:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.358659 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.358758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.358776 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.358803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.358825 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:01Z","lastTransitionTime":"2025-12-02T19:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.462486 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.462557 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.462574 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.462598 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.462616 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:01Z","lastTransitionTime":"2025-12-02T19:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.565429 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.565498 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.565520 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.565550 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.565572 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:01Z","lastTransitionTime":"2025-12-02T19:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.668878 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.668950 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.668975 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.669000 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.669017 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:01Z","lastTransitionTime":"2025-12-02T19:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.772703 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.772816 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.772841 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.772870 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.772892 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:01Z","lastTransitionTime":"2025-12-02T19:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.876311 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.876378 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.876396 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.876420 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.876440 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:01Z","lastTransitionTime":"2025-12-02T19:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.972168 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.972234 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.972276 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.972372 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:01 crc kubenswrapper[4807]: E1202 19:59:01.972363 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:01 crc kubenswrapper[4807]: E1202 19:59:01.972525 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:01 crc kubenswrapper[4807]: E1202 19:59:01.972595 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:01 crc kubenswrapper[4807]: E1202 19:59:01.972706 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.979540 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.979587 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.979603 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.979621 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:01 crc kubenswrapper[4807]: I1202 19:59:01.979635 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:01Z","lastTransitionTime":"2025-12-02T19:59:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.082911 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.082996 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.083025 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.083052 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.083073 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:02Z","lastTransitionTime":"2025-12-02T19:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.186706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.186846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.186885 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.186974 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.187001 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:02Z","lastTransitionTime":"2025-12-02T19:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.290522 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.290576 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.290585 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.290604 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.290614 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:02Z","lastTransitionTime":"2025-12-02T19:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.393949 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.394027 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.394047 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.394075 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.394095 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:02Z","lastTransitionTime":"2025-12-02T19:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.497584 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.497646 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.497665 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.497689 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.497713 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:02Z","lastTransitionTime":"2025-12-02T19:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.600659 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.600729 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.600740 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.600756 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.600766 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:02Z","lastTransitionTime":"2025-12-02T19:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.704514 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.704580 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.704603 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.704624 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.704642 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:02Z","lastTransitionTime":"2025-12-02T19:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.807514 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.807572 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.807589 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.807606 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.807621 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:02Z","lastTransitionTime":"2025-12-02T19:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.910676 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.910748 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.910758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.910778 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.910788 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:02Z","lastTransitionTime":"2025-12-02T19:59:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.973661 4807 scope.go:117] "RemoveContainer" containerID="7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2" Dec 02 19:59:02 crc kubenswrapper[4807]: I1202 19:59:02.988116 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.013908 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.013957 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.013968 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.013986 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.013999 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:03Z","lastTransitionTime":"2025-12-02T19:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.116747 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.117418 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.117758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.117786 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.117801 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:03Z","lastTransitionTime":"2025-12-02T19:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.220343 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.220438 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.220470 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.220501 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.220524 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:03Z","lastTransitionTime":"2025-12-02T19:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.328688 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.328757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.328769 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.328786 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.328800 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:03Z","lastTransitionTime":"2025-12-02T19:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.437488 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.437535 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.437545 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.437561 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.437574 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:03Z","lastTransitionTime":"2025-12-02T19:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.540483 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.540555 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.540573 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.540598 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.540618 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:03Z","lastTransitionTime":"2025-12-02T19:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.644392 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.644447 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.644459 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.644480 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.644494 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:03Z","lastTransitionTime":"2025-12-02T19:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.747692 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.747826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.747846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.747871 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.747891 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:03Z","lastTransitionTime":"2025-12-02T19:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.851063 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.851118 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.851130 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.851153 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.851173 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:03Z","lastTransitionTime":"2025-12-02T19:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.954446 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.954520 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.954538 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.954565 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.954584 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:03Z","lastTransitionTime":"2025-12-02T19:59:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.971791 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.971828 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.971881 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:03 crc kubenswrapper[4807]: E1202 19:59:03.971937 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:03 crc kubenswrapper[4807]: I1202 19:59:03.971981 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:03 crc kubenswrapper[4807]: E1202 19:59:03.972104 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:03 crc kubenswrapper[4807]: E1202 19:59:03.972300 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:03 crc kubenswrapper[4807]: E1202 19:59:03.972530 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.057798 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.057866 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.057882 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.057906 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.057923 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:04Z","lastTransitionTime":"2025-12-02T19:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.161457 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.161524 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.161536 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.161558 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.161592 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:04Z","lastTransitionTime":"2025-12-02T19:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.264851 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.264914 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.264934 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.264958 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.264977 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:04Z","lastTransitionTime":"2025-12-02T19:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.367421 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.367468 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.367480 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.367500 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.367512 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:04Z","lastTransitionTime":"2025-12-02T19:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.471180 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.471237 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.471246 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.471264 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.471275 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:04Z","lastTransitionTime":"2025-12-02T19:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.528096 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/2.log" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.531922 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c"} Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.533269 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.549956 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.570410 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.576144 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.576219 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.576243 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.576270 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.576289 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:04Z","lastTransitionTime":"2025-12-02T19:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.588702 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.606424 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.622510 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.638535 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d6
3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:52Z\\\",\\\"message\\\":\\\"2025-12-02T19:58:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72\\\\n2025-12-02T19:58:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72 to /host/opt/cni/bin/\\\\n2025-12-02T19:58:06Z [verbose] multus-daemon started\\\\n2025-12-02T19:58:06Z [verbose] Readiness Indicator file check\\\\n2025-12-02T19:58:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.655648 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.671411 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.679429 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.679474 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.679483 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.679500 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.679512 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:04Z","lastTransitionTime":"2025-12-02T19:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.684905 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32300666-1489-4630-a690-2f438df4750f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70220dddbe9bd2bfff2d1e664146673a1d2f33b9dc08f717241446e2baa60d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b067e9a8d2652bfdd33e8e62427cfc1d871c6d170422a45d5676fbc4cd415f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b067e9a8d2652bfdd33e8e62427cfc1d871c6d170422a45d5676fbc4cd415f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.698981 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28b
e18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.712954 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc 
kubenswrapper[4807]: I1202 19:59:04.728858 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.744872 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.761010 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.780452 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:34Z\\\",\\\"message\\\":\\\", AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 19:58:33.879388 6461 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start 
defa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.782241 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.782280 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.782297 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.782318 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.782330 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:04Z","lastTransitionTime":"2025-12-02T19:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.794536 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.807962 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a51f0b9b-8269-4d6f-b5ec-55870461ebe1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://287123147f1562edac9044a1b110e49241e677f8df9ac9987c0865d0d202f9de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8def9307
22eb9a717d4c8945163ad1313a7912ac511b4a4c66b942a012afb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6993537ec825ca91163121a75d7129394b78f55140f7e4f96339a6609486f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.833876 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e75
76da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.885579 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.885953 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.886034 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.886107 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.886174 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:04Z","lastTransitionTime":"2025-12-02T19:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.989185 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.989299 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.989323 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.989352 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.989372 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:04Z","lastTransitionTime":"2025-12-02T19:59:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:04 crc kubenswrapper[4807]: I1202 19:59:04.990070 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:04Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.022735 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:34Z\\\",\\\"message\\\":\\\", AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 19:58:33.879388 6461 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start 
defa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.045392 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.067049 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a51f0b9b-8269-4d6f-b5ec-55870461ebe1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://287123147f1562edac9044a1b110e49241e677f8df9ac9987c0865d0d202f9de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8def930722eb9a717d4c8945163ad1313a7912ac511b4a4c66b942a012afb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6993537ec825ca91163121a75d7129394b78f55140f7e4f96339a6609486f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.092679 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.092827 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.092880 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.092910 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.092962 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:05Z","lastTransitionTime":"2025-12-02T19:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.094909 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.120449 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.148473 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.169467 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.192511 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.195791 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.195843 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.195861 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.195883 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.195899 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:05Z","lastTransitionTime":"2025-12-02T19:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.209688 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.231778 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:52Z\\\",\\\"message\\\":\\\"2025-12-02T19:58:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72\\\\n2025-12-02T19:58:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72 to /host/opt/cni/bin/\\\\n2025-12-02T19:58:06Z [verbose] multus-daemon started\\\\n2025-12-02T19:58:06Z [verbose] Readiness Indicator file check\\\\n2025-12-02T19:58:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.248682 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.267673 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.281549 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32300666-1489-4630-a690-2f438df4750f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70220dddbe9bd2bfff2d1e664146673a1d2f33b9dc08f717241446e2baa60d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b067e9a8d2652bfdd33e8e62427cfc1d871c6d170422a45d5676fbc4cd415f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b067e9a8d2652bfdd33e8e62427cfc1d871c6d170422a45d5676fbc4cd415f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.298046 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.298106 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.298123 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.298143 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.298157 4807 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:05Z","lastTransitionTime":"2025-12-02T19:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.298966 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.316950 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc 
kubenswrapper[4807]: I1202 19:59:05.344957 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.363606 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.400340 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.400411 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.400438 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.400467 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.400490 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:05Z","lastTransitionTime":"2025-12-02T19:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.502856 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.502924 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.502949 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.502980 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.503004 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:05Z","lastTransitionTime":"2025-12-02T19:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.536530 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/3.log" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.537480 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/2.log" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.541837 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" exitCode=1 Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.541893 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c"} Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.541968 4807 scope.go:117] "RemoveContainer" containerID="7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.543085 4807 scope.go:117] "RemoveContainer" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 19:59:05 crc kubenswrapper[4807]: E1202 19:59:05.543404 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" Dec 02 19:59:05 crc kubenswrapper[4807]: E1202 19:59:05.552742 4807 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod798a6158_a963_43b4_941e_ac4f3df2f883.slice/crio-conmon-4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c.scope\": RecentStats: unable to find data in memory cache]" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.572117 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.589247 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.603226 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.605666 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.605803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.605825 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.605855 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.605881 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:05Z","lastTransitionTime":"2025-12-02T19:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.623639 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:52Z\\\",\\\"message\\\":\\\"2025-12-02T19:58:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72\\\\n2025-12-02T19:58:06+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72 to /host/opt/cni/bin/\\\\n2025-12-02T19:58:06Z [verbose] multus-daemon started\\\\n2025-12-02T19:58:06Z [verbose] Readiness Indicator file check\\\\n2025-12-02T19:58:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.640431 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ffa4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.661662 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.677872 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32300666-1489-4630-a690-2f438df4750f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70220dddbe9bd2bfff2d1e664146673a1d2f33b9dc08f717241446e2baa60d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b067e9a8d2652bfdd33e8e62427cfc1d871c6d170422a45d5676fbc4cd415f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b067e9a8d2652bfdd33e8e62427cfc1d871c6d170422a45d5676fbc4cd415f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.697136 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28b
e18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.709439 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.709829 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.710063 4807 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.710261 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.710489 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:05Z","lastTransitionTime":"2025-12-02T19:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.716818 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 
02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.742139 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.758629 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.780136 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e944068e1e5df350bd3e0087be7a7f5d38382c07641b1ca5e896158543973e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:34Z\\\",\\\"message\\\":\\\", AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1202 19:58:33.879388 6461 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start defa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:59:05Z\\\",\\\"message\\\":\\\"nformers/factory.go:160\\\\nI1202 19:59:04.844566 6824 services_controller.go:614] Adding service 
openshift-kube-controller-manager-operator/metrics for network=default\\\\nI1202 19:59:04.844572 6824 services_controller.go:614] Adding service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI1202 19:59:04.844562 6824 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:59:04.844589 6824 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:59:04.844600 6824 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:59:04.844611 6824 services_controller.go:614] Adding service openshift-console/console for network=default\\\\nI1202 19:59:04.844901 6824 factory.go:656] Stopping watch factory\\\\nI1202 19:59:04.844993 6824 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 19:59:04.845371 6824 ovnkube.go:599] Stopped ovnkube\\\\nI1202 19:59:04.845436 6824 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 19:59:04.845579 6824 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:59:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.797585 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.813803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.813862 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.813880 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:05 crc 
kubenswrapper[4807]: I1202 19:59:05.813901 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.813916 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:05Z","lastTransitionTime":"2025-12-02T19:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.816484 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a51f0b9b-8269-4d6f-b5ec-55870461ebe1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://287123147f1562edac9044a1b110e49241e677f8df9ac9987c0865d0d202f9de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8def930722eb9a717d4c8945163ad1313a7912ac511b4a4c66b942a012afb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6993537ec825ca91163121a75d7129394b78f55140f7e4f96339a6609486f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.837244 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e75
76da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.855747 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.868486 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.886461 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389
648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:05Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.916991 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.917090 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.917118 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.917153 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.917177 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:05Z","lastTransitionTime":"2025-12-02T19:59:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.971941 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.971997 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.972075 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:05 crc kubenswrapper[4807]: E1202 19:59:05.972143 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:05 crc kubenswrapper[4807]: E1202 19:59:05.972248 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:05 crc kubenswrapper[4807]: I1202 19:59:05.972308 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:05 crc kubenswrapper[4807]: E1202 19:59:05.972354 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:05 crc kubenswrapper[4807]: E1202 19:59:05.972398 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.020561 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.020608 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.020619 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.020637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.020651 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:06Z","lastTransitionTime":"2025-12-02T19:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.123652 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.123685 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.123695 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.123709 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.123735 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:06Z","lastTransitionTime":"2025-12-02T19:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.226678 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.226750 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.226767 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.226789 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.226805 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:06Z","lastTransitionTime":"2025-12-02T19:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.330026 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.330079 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.330095 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.330123 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.330141 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:06Z","lastTransitionTime":"2025-12-02T19:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.433381 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.433438 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.433460 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.433486 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.433504 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:06Z","lastTransitionTime":"2025-12-02T19:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.537298 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.537670 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.537758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.537831 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.537922 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:06Z","lastTransitionTime":"2025-12-02T19:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.547697 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/3.log" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.553207 4807 scope.go:117] "RemoveContainer" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 19:59:06 crc kubenswrapper[4807]: E1202 19:59:06.553466 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.574118 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff3c3b77d4097cbe19d4371f4bcdc9db767770346cf742e1a2d403dc3537b2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.598955 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.618494 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a51f0b9b-8269-4d6f-b5ec-55870461ebe1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://287123147f1562edac9044a1b110e49241e677f8df9ac9987c0865d0d202f9de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b8def930722eb9a717d4c8945163ad1313a7912ac511b4a4c66b942a012afb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6993537ec825ca91163121a75d7129394b78f55140f7e4f96339a6609486f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6a1f372f877ab95f8f70e851c79fe11b00054c4c5adedfd712c50a68f2379fbb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.642229 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.642289 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.642307 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.642332 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.642351 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:06Z","lastTransitionTime":"2025-12-02T19:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.643236 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"932e803a-d171-4e99-b9bc-4776e51bfc97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T19:58:04Z\\\",\\\"message\\\":\\\"file observer\\\\nW1202 19:58:03.589134 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 19:58:03.589344 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 19:58:03.590836 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-756201955/tls.crt::/tmp/serving-cert-756201955/tls.key\\\\\\\"\\\\nI1202 19:58:03.974622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 19:58:03.978794 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 19:58:03.978823 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 19:58:03.978850 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 19:58:03.978856 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 19:58:03.988049 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 19:58:03.988074 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988080 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 19:58:03.988084 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 19:58:03.988087 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 19:58:03.988090 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 19:58:03.988093 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 19:58:03.988118 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 19:58:03.989001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.660233 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b7144de0d685f1c2eae3051fb3d1772acdb7b980fff783b20df0f1b78bf17d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a14056a956fbc47b2d620874a46906895cfa3777a2656fc665f701758a2370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.694743 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798a6158-a963-43b4-941e-ac4f3df2f883\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:59:05Z\\\",\\\"message\\\":\\\"nformers/factory.go:160\\\\nI1202 19:59:04.844566 6824 services_controller.go:614] Adding service openshift-kube-controller-manager-operator/metrics for network=default\\\\nI1202 19:59:04.844572 6824 services_controller.go:614] Adding service openshift-machine-config-operator/machine-config-controller for network=default\\\\nI1202 19:59:04.844562 
6824 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:59:04.844589 6824 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 19:59:04.844600 6824 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 19:59:04.844611 6824 services_controller.go:614] Adding service openshift-console/console for network=default\\\\nI1202 19:59:04.844901 6824 factory.go:656] Stopping watch factory\\\\nI1202 19:59:04.844993 6824 handler.go:208] Removed *v1.Node event handler 2\\\\nI1202 19:59:04.845371 6824 ovnkube.go:599] Stopped ovnkube\\\\nI1202 19:59:04.845436 6824 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 19:59:04.845579 6824 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:59:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49c817a65265c9d4e9
8f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdj78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5plsn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.711660 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4aed9271-ad06-407e-b805-80c5dfea98ce\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af107daa8dda4e94d14591de18736f03ce198a036784762e05b7ccde8703eadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f6e314704558e06559f2a18a59b37da486a560
43feaa7d9b9b472d48b31079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j9jxp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-wb7h5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.729146 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.745709 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.745808 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.745827 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.745857 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.745874 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:06Z","lastTransitionTime":"2025-12-02T19:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.756105 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600edb6b-1fb6-4946-9d09-a8e5c94045b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea527c2421c886edc8de18f7d7878e791dc87987239d950229d1e6463c9fcfbb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14ceb3afc0451006258cecd2e8fc4fa230b82eaef216e554c5a98a13aa01a6c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://917880a7d8ad9ebea6c0eec7e6f8c512c00647ae562283e4952572c893559a31\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://066f4088488bd153f74e22915cdb53d81f3d81ca7fbf0aeba7328f685560d4af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39389648f55d7f798acc0b2be87881e6d4d3d81589a7c1165965c285e5bbdb32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7f605e8265d89a9841715bc000521e06337b12c2d0451bf547b691db86a0656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://541386bb8d3cfcf122170846570de8b8f5926b407a6d10d4cad37f0c8fb7dc21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:58:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lkxb2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nxxz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.774089 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-f5x8r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a909a25-5ede-458e-af78-4a41b79716a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T19:58:52Z\\\",\\\"message\\\":\\\"2025-12-02T19:58:06+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72\\\\n2025-12-02T19:58:06+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d6c3df64-c664-42d0-aca5-347b9f685d72 to /host/opt/cni/bin/\\\\n2025-12-02T19:58:06Z [verbose] multus-daemon started\\\\n2025-12-02T19:58:06Z [verbose] 
Readiness Indicator file check\\\\n2025-12-02T19:58:51Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7dxxc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-f5x8r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.790657 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4jpt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"980d1fa3-8069-42bc-9510-44148f64cab6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53148732bccc764ff
a4dabaf31930338fd7e36cf0830daeef7cb57dda51c37b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5v5t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4jpt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.808275 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6be54112-e133-4b67-8246-3d860fce0d2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04eb945d2536ea51ebaef74709bd29f2d565e9c82856a3e63e9992b0988d1661\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57f9ecdfc7bfd82e4c9434b1f46cac3a90bfc16c90aa74fd732619605337f937\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://733cc13104aa483af9a31e581743d55acfaf15ea08ce930521fb3142b4e0c9d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.822996 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32300666-1489-4630-a690-2f438df4750f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:57:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://70220dddbe9bd2bfff2d1e664146673a1d2f33b9dc08f717241446e2baa60d79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:57:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b067e9a8d2652bfdd33e8e62427cfc1d871c6d170422a45d5676fbc4cd415f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b067e9a8d2652bfdd33e8e62427cfc1d871c6d170422a45d5676fbc4cd415f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T19:57:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T19:57:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:57:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.843317 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.848011 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.848099 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:06 crc 
kubenswrapper[4807]: I1202 19:59:06.848120 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.848151 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.848171 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:06Z","lastTransitionTime":"2025-12-02T19:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.860033 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df96b0aff189fe5c6befeefa1b96c77776a986fa21527141f52221f80f493ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.876225 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bpjf4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"834825ad-b4fa-4449-92c6-4299aecbaaec\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1b567abb1b3749e8fdbc32d0f6e69ecce171c9afc657b02df3e3246714e64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gwnc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:05Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bpjf4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.894122 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af96502b-9dc2-4b45-8099-20f6fd28df4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c46fe6ed5b39d7061be6cc035f987
b35f4c2e8efd8c958942903cd2ccd21ae4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecaa8988ab1f5c2702147a6aee69a4ccdc28be18c5ec72fc0c0a3fc4c9d319cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T19:58:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nzfn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"1
92.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-ht6kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.911779 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cb49a08-30b0-4353-ad4a-23362f281475\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T19:58:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdm6j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T19:58:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7z9t6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:06Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:06 crc 
kubenswrapper[4807]: I1202 19:59:06.952051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.952100 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.952111 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.952135 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:06 crc kubenswrapper[4807]: I1202 19:59:06.952152 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:06Z","lastTransitionTime":"2025-12-02T19:59:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.056240 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.056328 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.056348 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.056382 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.056402 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:07Z","lastTransitionTime":"2025-12-02T19:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.159970 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.160025 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.160038 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.160057 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.160072 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:07Z","lastTransitionTime":"2025-12-02T19:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.263778 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.263851 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.263870 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.263897 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.263921 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:07Z","lastTransitionTime":"2025-12-02T19:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.367448 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.367490 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.367500 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.367515 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.367525 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:07Z","lastTransitionTime":"2025-12-02T19:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.470170 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.470233 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.470246 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.470265 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.470279 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:07Z","lastTransitionTime":"2025-12-02T19:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.573524 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.573588 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.573609 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.573637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.573655 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:07Z","lastTransitionTime":"2025-12-02T19:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.677757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.677800 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.677812 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.677828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.677839 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:07Z","lastTransitionTime":"2025-12-02T19:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.781428 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.781534 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.781553 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.781579 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.781598 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:07Z","lastTransitionTime":"2025-12-02T19:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.885184 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.885258 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.885277 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.885304 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.885326 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:07Z","lastTransitionTime":"2025-12-02T19:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.955878 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.956097 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.956167 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.956250 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.956194017 +0000 UTC m=+147.257101552 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.956368 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.956464 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.956434712 +0000 UTC m=+147.257342247 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.956486 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.956619 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.956823 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.956791611 +0000 UTC m=+147.257699136 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.956989 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.957026 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.957054 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.957183 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.957160739 +0000 UTC m=+147.258068274 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.971935 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.972062 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.972119 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.972063 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.972343 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.972538 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.972772 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:07 crc kubenswrapper[4807]: E1202 19:59:07.972934 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.988790 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.988893 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.988907 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.988928 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:07 crc kubenswrapper[4807]: I1202 19:59:07.988946 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:07Z","lastTransitionTime":"2025-12-02T19:59:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.058053 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:08 crc kubenswrapper[4807]: E1202 19:59:08.058284 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 19:59:08 crc kubenswrapper[4807]: E1202 19:59:08.058368 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 19:59:08 crc kubenswrapper[4807]: E1202 19:59:08.058387 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:59:08 crc kubenswrapper[4807]: E1202 19:59:08.058466 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 20:00:12.058444689 +0000 UTC m=+147.359352194 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.092165 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.092216 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.092227 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.092247 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.092261 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:08Z","lastTransitionTime":"2025-12-02T19:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.194614 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.194673 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.194685 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.194707 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.194742 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:08Z","lastTransitionTime":"2025-12-02T19:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.297569 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.297620 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.297638 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.297661 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.297678 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:08Z","lastTransitionTime":"2025-12-02T19:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.400899 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.400947 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.400959 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.400976 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.400988 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:08Z","lastTransitionTime":"2025-12-02T19:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.503824 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.503872 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.503881 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.503895 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.503905 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:08Z","lastTransitionTime":"2025-12-02T19:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.607276 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.607338 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.607354 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.607376 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.607392 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:08Z","lastTransitionTime":"2025-12-02T19:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.711083 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.711165 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.711187 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.711219 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.711244 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:08Z","lastTransitionTime":"2025-12-02T19:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.815464 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.815538 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.815555 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.815580 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.815602 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:08Z","lastTransitionTime":"2025-12-02T19:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.919160 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.919230 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.919247 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.919274 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:08 crc kubenswrapper[4807]: I1202 19:59:08.919291 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:08Z","lastTransitionTime":"2025-12-02T19:59:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.023706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.023837 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.023861 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.023892 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.023916 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.126892 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.126957 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.126977 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.127007 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.127027 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.231842 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.232518 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.232541 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.232569 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.232590 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.335994 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.336059 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.336074 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.336097 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.336114 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.439991 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.440067 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.440088 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.440116 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.440136 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.543305 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.543404 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.543431 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.543467 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.543492 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.622925 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.623017 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.623043 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.623076 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.623095 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: E1202 19:59:09.646528 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.651684 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.651851 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.651878 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.651907 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.651934 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: E1202 19:59:09.674491 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.679784 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.679855 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.679880 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.679913 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.679938 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: E1202 19:59:09.701410 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.706490 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.706555 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.706576 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.706602 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.706619 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: E1202 19:59:09.726678 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.731601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.731654 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.731663 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.731680 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.731694 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: E1202 19:59:09.750142 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T19:59:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b38988ab-e6bd-44f1-a049-4d7d2ffee59a\\\",\\\"systemUUID\\\":\\\"4d376703-634d-4ff9-8cdc-7b05f903cec2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T19:59:09Z is after 2025-08-24T17:21:41Z" Dec 02 19:59:09 crc kubenswrapper[4807]: E1202 19:59:09.750297 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.751990 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.752025 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.752038 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.752056 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.752069 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.855369 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.855439 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.855455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.855483 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.855500 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.958250 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.958307 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.958325 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.958349 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.958369 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:09Z","lastTransitionTime":"2025-12-02T19:59:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.971874 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.971991 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.971991 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:09 crc kubenswrapper[4807]: E1202 19:59:09.972089 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:09 crc kubenswrapper[4807]: E1202 19:59:09.972220 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:09 crc kubenswrapper[4807]: I1202 19:59:09.972292 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:09 crc kubenswrapper[4807]: E1202 19:59:09.972455 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:09 crc kubenswrapper[4807]: E1202 19:59:09.972536 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.061756 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.061828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.061843 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.061868 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.061888 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:10Z","lastTransitionTime":"2025-12-02T19:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.164388 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.164440 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.164449 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.164465 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.164476 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:10Z","lastTransitionTime":"2025-12-02T19:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.267669 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.267729 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.267740 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.267755 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.267765 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:10Z","lastTransitionTime":"2025-12-02T19:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.370658 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.370740 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.370757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.370780 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.370796 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:10Z","lastTransitionTime":"2025-12-02T19:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.474216 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.474276 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.474293 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.474318 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.474335 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:10Z","lastTransitionTime":"2025-12-02T19:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.578013 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.578085 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.578107 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.578142 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.578164 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:10Z","lastTransitionTime":"2025-12-02T19:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.681492 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.681558 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.681579 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.681597 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.681610 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:10Z","lastTransitionTime":"2025-12-02T19:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.784476 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.784558 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.784582 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.784614 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.784634 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:10Z","lastTransitionTime":"2025-12-02T19:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.888596 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.888700 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.888758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.888790 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.888812 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:10Z","lastTransitionTime":"2025-12-02T19:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.992515 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.992592 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.992610 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.992634 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:10 crc kubenswrapper[4807]: I1202 19:59:10.992655 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:10Z","lastTransitionTime":"2025-12-02T19:59:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.096753 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.096830 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.096852 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.096876 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.096894 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:11Z","lastTransitionTime":"2025-12-02T19:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.199327 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.199365 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.199372 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.199385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.199393 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:11Z","lastTransitionTime":"2025-12-02T19:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.302143 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.302188 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.302204 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.302225 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.302243 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:11Z","lastTransitionTime":"2025-12-02T19:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.405588 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.405651 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.405662 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.405686 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.405700 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:11Z","lastTransitionTime":"2025-12-02T19:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.509321 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.509404 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.509429 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.509462 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.509482 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:11Z","lastTransitionTime":"2025-12-02T19:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.613010 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.613073 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.613097 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.613126 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.613148 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:11Z","lastTransitionTime":"2025-12-02T19:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.716586 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.716657 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.716676 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.716700 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.716768 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:11Z","lastTransitionTime":"2025-12-02T19:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.820295 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.820345 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.820360 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.820382 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.820399 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:11Z","lastTransitionTime":"2025-12-02T19:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.922968 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.923027 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.923040 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.923060 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.923074 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:11Z","lastTransitionTime":"2025-12-02T19:59:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.971415 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.971459 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.971459 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:11 crc kubenswrapper[4807]: I1202 19:59:11.971422 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:11 crc kubenswrapper[4807]: E1202 19:59:11.971577 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:11 crc kubenswrapper[4807]: E1202 19:59:11.971689 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:11 crc kubenswrapper[4807]: E1202 19:59:11.971837 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:11 crc kubenswrapper[4807]: E1202 19:59:11.971918 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.025706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.025794 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.025814 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.025834 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.025847 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:12Z","lastTransitionTime":"2025-12-02T19:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.128348 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.128421 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.128446 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.128479 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.128503 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:12Z","lastTransitionTime":"2025-12-02T19:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.231062 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.231136 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.231156 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.231186 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.231232 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:12Z","lastTransitionTime":"2025-12-02T19:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.334134 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.334214 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.334251 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.334279 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.334300 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:12Z","lastTransitionTime":"2025-12-02T19:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.436666 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.436788 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.436823 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.436853 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.436875 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:12Z","lastTransitionTime":"2025-12-02T19:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.540012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.540105 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.540122 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.540171 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.540193 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:12Z","lastTransitionTime":"2025-12-02T19:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.644022 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.644088 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.644099 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.644115 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.644128 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:12Z","lastTransitionTime":"2025-12-02T19:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.747452 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.747505 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.747521 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.747542 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.747558 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:12Z","lastTransitionTime":"2025-12-02T19:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.850133 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.850223 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.850242 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.850281 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.850310 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:12Z","lastTransitionTime":"2025-12-02T19:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.953273 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.953389 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.953410 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.953437 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:12 crc kubenswrapper[4807]: I1202 19:59:12.953456 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:12Z","lastTransitionTime":"2025-12-02T19:59:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.056662 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.056712 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.056742 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.056761 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.056772 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:13Z","lastTransitionTime":"2025-12-02T19:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.160018 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.160079 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.160092 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.160111 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.160124 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:13Z","lastTransitionTime":"2025-12-02T19:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.263647 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.263704 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.263759 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.263784 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.263801 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:13Z","lastTransitionTime":"2025-12-02T19:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.366466 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.366546 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.366567 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.366593 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.366622 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:13Z","lastTransitionTime":"2025-12-02T19:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.469942 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.470037 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.470071 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.470103 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.470126 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:13Z","lastTransitionTime":"2025-12-02T19:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.573481 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.573543 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.573555 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.573576 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.573589 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:13Z","lastTransitionTime":"2025-12-02T19:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.681587 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.681643 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.681657 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.681677 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.681690 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:13Z","lastTransitionTime":"2025-12-02T19:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.784863 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.784931 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.784949 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.784969 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.784985 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:13Z","lastTransitionTime":"2025-12-02T19:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.888573 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.888646 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.888662 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.888686 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.888702 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:13Z","lastTransitionTime":"2025-12-02T19:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.971574 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.971758 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.971793 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:13 crc kubenswrapper[4807]: E1202 19:59:13.971862 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:13 crc kubenswrapper[4807]: E1202 19:59:13.972004 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:13 crc kubenswrapper[4807]: E1202 19:59:13.972160 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.972190 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:13 crc kubenswrapper[4807]: E1202 19:59:13.972290 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.992258 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.992343 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.992367 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.992398 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:13 crc kubenswrapper[4807]: I1202 19:59:13.992421 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:13Z","lastTransitionTime":"2025-12-02T19:59:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.096248 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.096318 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.096335 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.096364 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.096440 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:14Z","lastTransitionTime":"2025-12-02T19:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.199817 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.199914 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.199949 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.199989 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.200014 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:14Z","lastTransitionTime":"2025-12-02T19:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.303119 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.303197 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.303216 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.303241 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.303259 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:14Z","lastTransitionTime":"2025-12-02T19:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.407424 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.407481 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.407497 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.407517 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.407531 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:14Z","lastTransitionTime":"2025-12-02T19:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.510511 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.510626 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.510651 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.510679 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.510701 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:14Z","lastTransitionTime":"2025-12-02T19:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.614225 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.614277 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.614286 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.614302 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.614316 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:14Z","lastTransitionTime":"2025-12-02T19:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.718012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.718104 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.718123 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.718149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.718169 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:14Z","lastTransitionTime":"2025-12-02T19:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.821812 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.821888 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.821911 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.821935 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.821954 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:14Z","lastTransitionTime":"2025-12-02T19:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.925990 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.926058 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.926078 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.926105 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.926128 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:14Z","lastTransitionTime":"2025-12-02T19:59:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:14 crc kubenswrapper[4807]: I1202 19:59:14.995485 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bpjf4" podStartSLOduration=70.995465484 podStartE2EDuration="1m10.995465484s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:14.995464324 +0000 UTC m=+90.296371839" watchObservedRunningTime="2025-12-02 19:59:14.995465484 +0000 UTC m=+90.296372979" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.029981 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.030051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.030071 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.030096 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.030119 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:15Z","lastTransitionTime":"2025-12-02T19:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.034328 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-f5x8r" podStartSLOduration=71.034302977 podStartE2EDuration="1m11.034302977s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:15.019380396 +0000 UTC m=+90.320287931" watchObservedRunningTime="2025-12-02 19:59:15.034302977 +0000 UTC m=+90.335210512" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.052448 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4jpt9" podStartSLOduration=71.052416562 podStartE2EDuration="1m11.052416562s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:15.034913281 +0000 UTC m=+90.335820806" watchObservedRunningTime="2025-12-02 19:59:15.052416562 +0000 UTC m=+90.353324097" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.069287 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.069256278 podStartE2EDuration="1m8.069256278s" podCreationTimestamp="2025-12-02 19:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:15.052006313 +0000 UTC m=+90.352913838" watchObservedRunningTime="2025-12-02 19:59:15.069256278 +0000 UTC m=+90.370163823" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.069525 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=13.069514374 
podStartE2EDuration="13.069514374s" podCreationTimestamp="2025-12-02 19:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:15.06847635 +0000 UTC m=+90.369383915" watchObservedRunningTime="2025-12-02 19:59:15.069514374 +0000 UTC m=+90.370421909" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.118036 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-ht6kp" podStartSLOduration=70.118015504 podStartE2EDuration="1m10.118015504s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:15.117837489 +0000 UTC m=+90.418744994" watchObservedRunningTime="2025-12-02 19:59:15.118015504 +0000 UTC m=+90.418923009" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.132219 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.132279 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.132296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.132321 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.132343 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:15Z","lastTransitionTime":"2025-12-02T19:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.235939 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.235983 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.235993 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.236012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.236025 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:15Z","lastTransitionTime":"2025-12-02T19:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.243054 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.243039731 podStartE2EDuration="39.243039731s" podCreationTimestamp="2025-12-02 19:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:15.19745299 +0000 UTC m=+90.498360495" watchObservedRunningTime="2025-12-02 19:59:15.243039731 +0000 UTC m=+90.543947226" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.243442 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=71.24343775 podStartE2EDuration="1m11.24343775s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:15.24298672 +0000 UTC m=+90.543894225" watchObservedRunningTime="2025-12-02 19:59:15.24343775 +0000 UTC m=+90.544345245" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.301677 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podStartSLOduration=71.301651378 podStartE2EDuration="1m11.301651378s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:15.30130731 +0000 UTC m=+90.602214815" watchObservedRunningTime="2025-12-02 19:59:15.301651378 +0000 UTC m=+90.602558873" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.338440 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:15 crc kubenswrapper[4807]: 
I1202 19:59:15.338486 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.338495 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.338510 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.338524 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:15Z","lastTransitionTime":"2025-12-02T19:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.441891 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.441976 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.442003 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.442038 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.442069 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:15Z","lastTransitionTime":"2025-12-02T19:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.545367 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.545433 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.545455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.545479 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.545498 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:15Z","lastTransitionTime":"2025-12-02T19:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.647649 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.647699 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.647739 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.647760 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.647774 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:15Z","lastTransitionTime":"2025-12-02T19:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.750310 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.750360 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.750373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.750389 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.750403 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:15Z","lastTransitionTime":"2025-12-02T19:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.854340 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.854409 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.854428 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.854457 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.854477 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:15Z","lastTransitionTime":"2025-12-02T19:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.958631 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.958676 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.958684 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.958700 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.958711 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:15Z","lastTransitionTime":"2025-12-02T19:59:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.971494 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.971562 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.971556 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:15 crc kubenswrapper[4807]: I1202 19:59:15.971561 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:15 crc kubenswrapper[4807]: E1202 19:59:15.972341 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:15 crc kubenswrapper[4807]: E1202 19:59:15.972508 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:15 crc kubenswrapper[4807]: E1202 19:59:15.972894 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:15 crc kubenswrapper[4807]: E1202 19:59:15.973062 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.061645 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.061706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.061748 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.061774 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.061792 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:16Z","lastTransitionTime":"2025-12-02T19:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.164376 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.164439 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.164459 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.164484 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.164507 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:16Z","lastTransitionTime":"2025-12-02T19:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.267438 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.267535 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.267562 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.267595 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.267622 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:16Z","lastTransitionTime":"2025-12-02T19:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.370758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.370828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.370850 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.370877 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.370898 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:16Z","lastTransitionTime":"2025-12-02T19:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.473741 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.473804 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.473816 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.473837 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.473850 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:16Z","lastTransitionTime":"2025-12-02T19:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.576755 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.576831 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.576862 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.576898 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.576922 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:16Z","lastTransitionTime":"2025-12-02T19:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.679514 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.679600 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.679626 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.679656 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.679682 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:16Z","lastTransitionTime":"2025-12-02T19:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.783597 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.783672 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.783694 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.783762 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.783788 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:16Z","lastTransitionTime":"2025-12-02T19:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.886958 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.887052 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.887072 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.887101 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.887120 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:16Z","lastTransitionTime":"2025-12-02T19:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.990052 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.990144 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.990171 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.990202 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:16 crc kubenswrapper[4807]: I1202 19:59:16.990225 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:16Z","lastTransitionTime":"2025-12-02T19:59:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.093939 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.094019 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.094039 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.094068 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.094088 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:17Z","lastTransitionTime":"2025-12-02T19:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.197196 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.197240 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.197249 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.197271 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.197285 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:17Z","lastTransitionTime":"2025-12-02T19:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.300681 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.300794 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.300823 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.300859 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.300883 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:17Z","lastTransitionTime":"2025-12-02T19:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.404548 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.404657 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.404683 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.404713 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.404769 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:17Z","lastTransitionTime":"2025-12-02T19:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.508475 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.508544 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.508557 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.508582 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.508627 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:17Z","lastTransitionTime":"2025-12-02T19:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.612099 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.612163 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.612182 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.612209 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.612224 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:17Z","lastTransitionTime":"2025-12-02T19:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.716007 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.716075 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.716096 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.716124 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.716145 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:17Z","lastTransitionTime":"2025-12-02T19:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.819071 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.819144 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.819161 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.819189 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.819207 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:17Z","lastTransitionTime":"2025-12-02T19:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.923385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.923453 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.923472 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.923499 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.923517 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:17Z","lastTransitionTime":"2025-12-02T19:59:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.972033 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.972180 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.972180 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.972256 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:17 crc kubenswrapper[4807]: E1202 19:59:17.972410 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:17 crc kubenswrapper[4807]: E1202 19:59:17.972799 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:17 crc kubenswrapper[4807]: E1202 19:59:17.973286 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:17 crc kubenswrapper[4807]: E1202 19:59:17.973687 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:17 crc kubenswrapper[4807]: I1202 19:59:17.974089 4807 scope.go:117] "RemoveContainer" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 19:59:17 crc kubenswrapper[4807]: E1202 19:59:17.974380 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.026549 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.026585 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.026593 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.026607 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.026615 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:18Z","lastTransitionTime":"2025-12-02T19:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.130107 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.130189 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.130212 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.130243 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.130271 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:18Z","lastTransitionTime":"2025-12-02T19:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.233170 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.233239 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.233273 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.233301 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.233324 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:18Z","lastTransitionTime":"2025-12-02T19:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.337091 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.337158 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.337175 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.337201 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.337220 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:18Z","lastTransitionTime":"2025-12-02T19:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.439960 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.440004 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.440016 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.440033 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.440044 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:18Z","lastTransitionTime":"2025-12-02T19:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.543008 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.543068 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.543077 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.543098 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.543110 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:18Z","lastTransitionTime":"2025-12-02T19:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.647256 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.647328 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.647350 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.647385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.647412 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:18Z","lastTransitionTime":"2025-12-02T19:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.750464 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.750559 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.750578 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.750610 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.750636 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:18Z","lastTransitionTime":"2025-12-02T19:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.854364 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.854439 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.854455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.854480 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.854498 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:18Z","lastTransitionTime":"2025-12-02T19:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.958563 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.958635 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.958655 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.958685 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:18 crc kubenswrapper[4807]: I1202 19:59:18.958706 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:18Z","lastTransitionTime":"2025-12-02T19:59:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.061757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.061847 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.061872 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.061911 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.061949 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:19Z","lastTransitionTime":"2025-12-02T19:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.165296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.165366 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.165384 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.165416 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.165434 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:19Z","lastTransitionTime":"2025-12-02T19:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.269205 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.269273 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.269290 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.269315 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.269333 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:19Z","lastTransitionTime":"2025-12-02T19:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.372821 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.372902 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.372919 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.372940 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.372956 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:19Z","lastTransitionTime":"2025-12-02T19:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.476950 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.477033 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.477048 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.477069 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.477085 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:19Z","lastTransitionTime":"2025-12-02T19:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.579765 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.579841 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.579859 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.579885 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.579902 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:19Z","lastTransitionTime":"2025-12-02T19:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.683274 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.683317 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.683329 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.683347 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.683361 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:19Z","lastTransitionTime":"2025-12-02T19:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.786040 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.786089 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.786098 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.786115 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.786128 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:19Z","lastTransitionTime":"2025-12-02T19:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.889335 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.889974 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.889996 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.890025 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.890044 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:19Z","lastTransitionTime":"2025-12-02T19:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.971565 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:19 crc kubenswrapper[4807]: E1202 19:59:19.971853 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.972158 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.972351 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:19 crc kubenswrapper[4807]: E1202 19:59:19.972586 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.972665 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:19 crc kubenswrapper[4807]: E1202 19:59:19.972926 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:19 crc kubenswrapper[4807]: E1202 19:59:19.973049 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.973312 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.973373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.973391 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.973413 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.973428 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:19Z","lastTransitionTime":"2025-12-02T19:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.998333 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.998395 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.998412 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.998437 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 19:59:19 crc kubenswrapper[4807]: I1202 19:59:19.998457 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T19:59:19Z","lastTransitionTime":"2025-12-02T19:59:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.029535 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nxxz4" podStartSLOduration=76.029506269 podStartE2EDuration="1m16.029506269s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:15.332605705 +0000 UTC m=+90.633513200" watchObservedRunningTime="2025-12-02 19:59:20.029506269 +0000 UTC m=+95.330413804" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.030516 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk"] Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.031077 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.033314 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.033388 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.033405 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.034857 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.104510 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/c4fc1239-509a-45b1-aa52-369ceb4c06b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.104574 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c4fc1239-509a-45b1-aa52-369ceb4c06b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.104618 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4fc1239-509a-45b1-aa52-369ceb4c06b3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.104790 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4fc1239-509a-45b1-aa52-369ceb4c06b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.104841 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4fc1239-509a-45b1-aa52-369ceb4c06b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.205607 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4fc1239-509a-45b1-aa52-369ceb4c06b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.205658 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4fc1239-509a-45b1-aa52-369ceb4c06b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.205710 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c4fc1239-509a-45b1-aa52-369ceb4c06b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.205758 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c4fc1239-509a-45b1-aa52-369ceb4c06b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.205788 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c4fc1239-509a-45b1-aa52-369ceb4c06b3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.206403 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/c4fc1239-509a-45b1-aa52-369ceb4c06b3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.206483 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/c4fc1239-509a-45b1-aa52-369ceb4c06b3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.206857 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4fc1239-509a-45b1-aa52-369ceb4c06b3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.216564 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4fc1239-509a-45b1-aa52-369ceb4c06b3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 
19:59:20.227897 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c4fc1239-509a-45b1-aa52-369ceb4c06b3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-z6dqk\" (UID: \"c4fc1239-509a-45b1-aa52-369ceb4c06b3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.345235 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" Dec 02 19:59:20 crc kubenswrapper[4807]: W1202 19:59:20.363307 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4fc1239_509a_45b1_aa52_369ceb4c06b3.slice/crio-ba54cf764a8d53a1f4ec2ee45afdb19d96c3da3b72e09dbcbab72916ab410340 WatchSource:0}: Error finding container ba54cf764a8d53a1f4ec2ee45afdb19d96c3da3b72e09dbcbab72916ab410340: Status 404 returned error can't find the container with id ba54cf764a8d53a1f4ec2ee45afdb19d96c3da3b72e09dbcbab72916ab410340 Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.605813 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" event={"ID":"c4fc1239-509a-45b1-aa52-369ceb4c06b3","Type":"ContainerStarted","Data":"84839422c309af769cdeb6aa1d95e711ae983669c90e9ebddec1cfd2908a36b6"} Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.605893 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" event={"ID":"c4fc1239-509a-45b1-aa52-369ceb4c06b3","Type":"ContainerStarted","Data":"ba54cf764a8d53a1f4ec2ee45afdb19d96c3da3b72e09dbcbab72916ab410340"} Dec 02 19:59:20 crc kubenswrapper[4807]: I1202 19:59:20.629252 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-z6dqk" podStartSLOduration=76.629230828 podStartE2EDuration="1m16.629230828s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:20.627687282 +0000 UTC m=+95.928594857" watchObservedRunningTime="2025-12-02 19:59:20.629230828 +0000 UTC m=+95.930138323" Dec 02 19:59:21 crc kubenswrapper[4807]: I1202 19:59:21.971973 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:21 crc kubenswrapper[4807]: I1202 19:59:21.972035 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:21 crc kubenswrapper[4807]: I1202 19:59:21.972084 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:21 crc kubenswrapper[4807]: I1202 19:59:21.972034 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:21 crc kubenswrapper[4807]: E1202 19:59:21.972158 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:21 crc kubenswrapper[4807]: E1202 19:59:21.972224 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:21 crc kubenswrapper[4807]: E1202 19:59:21.972423 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:21 crc kubenswrapper[4807]: E1202 19:59:21.972511 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:23 crc kubenswrapper[4807]: I1202 19:59:23.136536 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:23 crc kubenswrapper[4807]: E1202 19:59:23.136736 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:59:23 crc kubenswrapper[4807]: E1202 19:59:23.136965 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs podName:1cb49a08-30b0-4353-ad4a-23362f281475 nodeName:}" failed. No retries permitted until 2025-12-02 20:00:27.136944697 +0000 UTC m=+162.437852192 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs") pod "network-metrics-daemon-7z9t6" (UID: "1cb49a08-30b0-4353-ad4a-23362f281475") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 19:59:23 crc kubenswrapper[4807]: I1202 19:59:23.972071 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:23 crc kubenswrapper[4807]: I1202 19:59:23.972171 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:23 crc kubenswrapper[4807]: E1202 19:59:23.972309 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:23 crc kubenswrapper[4807]: I1202 19:59:23.972610 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:23 crc kubenswrapper[4807]: I1202 19:59:23.972662 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:23 crc kubenswrapper[4807]: E1202 19:59:23.972817 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:23 crc kubenswrapper[4807]: E1202 19:59:23.973054 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:23 crc kubenswrapper[4807]: E1202 19:59:23.973238 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:25 crc kubenswrapper[4807]: I1202 19:59:25.972208 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:25 crc kubenswrapper[4807]: I1202 19:59:25.972246 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:25 crc kubenswrapper[4807]: I1202 19:59:25.972940 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:25 crc kubenswrapper[4807]: I1202 19:59:25.973179 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:25 crc kubenswrapper[4807]: E1202 19:59:25.973249 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:25 crc kubenswrapper[4807]: E1202 19:59:25.973096 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:25 crc kubenswrapper[4807]: E1202 19:59:25.973392 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:25 crc kubenswrapper[4807]: E1202 19:59:25.973523 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:27 crc kubenswrapper[4807]: I1202 19:59:27.971322 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:27 crc kubenswrapper[4807]: I1202 19:59:27.971392 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:27 crc kubenswrapper[4807]: E1202 19:59:27.971491 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:27 crc kubenswrapper[4807]: I1202 19:59:27.971524 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:27 crc kubenswrapper[4807]: I1202 19:59:27.971583 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:27 crc kubenswrapper[4807]: E1202 19:59:27.971956 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:27 crc kubenswrapper[4807]: E1202 19:59:27.971768 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:27 crc kubenswrapper[4807]: E1202 19:59:27.972061 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:29 crc kubenswrapper[4807]: I1202 19:59:29.971456 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:29 crc kubenswrapper[4807]: I1202 19:59:29.971474 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:29 crc kubenswrapper[4807]: E1202 19:59:29.972212 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:29 crc kubenswrapper[4807]: I1202 19:59:29.971459 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:29 crc kubenswrapper[4807]: I1202 19:59:29.971472 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:29 crc kubenswrapper[4807]: E1202 19:59:29.972326 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:29 crc kubenswrapper[4807]: E1202 19:59:29.972387 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:29 crc kubenswrapper[4807]: E1202 19:59:29.972460 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:30 crc kubenswrapper[4807]: I1202 19:59:30.973187 4807 scope.go:117] "RemoveContainer" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 19:59:30 crc kubenswrapper[4807]: E1202 19:59:30.973479 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" Dec 02 19:59:31 crc kubenswrapper[4807]: I1202 19:59:31.971768 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:31 crc kubenswrapper[4807]: I1202 19:59:31.971861 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:31 crc kubenswrapper[4807]: I1202 19:59:31.971994 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:31 crc kubenswrapper[4807]: E1202 19:59:31.971995 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:31 crc kubenswrapper[4807]: I1202 19:59:31.972119 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:31 crc kubenswrapper[4807]: E1202 19:59:31.972349 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:31 crc kubenswrapper[4807]: E1202 19:59:31.972529 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:31 crc kubenswrapper[4807]: E1202 19:59:31.972679 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:32 crc kubenswrapper[4807]: I1202 19:59:32.997415 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 02 19:59:33 crc kubenswrapper[4807]: I1202 19:59:33.971864 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:33 crc kubenswrapper[4807]: I1202 19:59:33.971965 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:33 crc kubenswrapper[4807]: I1202 19:59:33.972005 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:33 crc kubenswrapper[4807]: I1202 19:59:33.971931 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:33 crc kubenswrapper[4807]: E1202 19:59:33.972148 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:33 crc kubenswrapper[4807]: E1202 19:59:33.972349 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:33 crc kubenswrapper[4807]: E1202 19:59:33.972521 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:33 crc kubenswrapper[4807]: E1202 19:59:33.972634 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:35 crc kubenswrapper[4807]: I1202 19:59:35.022989 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.022955206 podStartE2EDuration="3.022955206s" podCreationTimestamp="2025-12-02 19:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:35.022359892 +0000 UTC m=+110.323267427" watchObservedRunningTime="2025-12-02 19:59:35.022955206 +0000 UTC m=+110.323862741" Dec 02 19:59:35 crc kubenswrapper[4807]: I1202 19:59:35.972201 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:35 crc kubenswrapper[4807]: I1202 19:59:35.972410 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:35 crc kubenswrapper[4807]: I1202 19:59:35.972481 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:35 crc kubenswrapper[4807]: I1202 19:59:35.972614 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:35 crc kubenswrapper[4807]: E1202 19:59:35.972797 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:35 crc kubenswrapper[4807]: E1202 19:59:35.973025 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:35 crc kubenswrapper[4807]: E1202 19:59:35.973131 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:35 crc kubenswrapper[4807]: E1202 19:59:35.973262 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:37 crc kubenswrapper[4807]: I1202 19:59:37.971599 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:37 crc kubenswrapper[4807]: I1202 19:59:37.971682 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:37 crc kubenswrapper[4807]: I1202 19:59:37.971706 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:37 crc kubenswrapper[4807]: E1202 19:59:37.971869 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:37 crc kubenswrapper[4807]: I1202 19:59:37.971894 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:37 crc kubenswrapper[4807]: E1202 19:59:37.972075 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:37 crc kubenswrapper[4807]: E1202 19:59:37.972204 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:37 crc kubenswrapper[4807]: E1202 19:59:37.972299 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:39 crc kubenswrapper[4807]: I1202 19:59:39.749483 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f5x8r_8a909a25-5ede-458e-af78-4a41b79716a5/kube-multus/1.log" Dec 02 19:59:39 crc kubenswrapper[4807]: I1202 19:59:39.749929 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f5x8r_8a909a25-5ede-458e-af78-4a41b79716a5/kube-multus/0.log" Dec 02 19:59:39 crc kubenswrapper[4807]: I1202 19:59:39.749985 4807 generic.go:334] "Generic (PLEG): container finished" podID="8a909a25-5ede-458e-af78-4a41b79716a5" containerID="c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d63" exitCode=1 Dec 02 19:59:39 crc kubenswrapper[4807]: I1202 19:59:39.750040 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f5x8r" 
event={"ID":"8a909a25-5ede-458e-af78-4a41b79716a5","Type":"ContainerDied","Data":"c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d63"} Dec 02 19:59:39 crc kubenswrapper[4807]: I1202 19:59:39.750087 4807 scope.go:117] "RemoveContainer" containerID="2702f864913447ce887569de3ca551a1d0236cd3a4cf73005e56dd939089a828" Dec 02 19:59:39 crc kubenswrapper[4807]: I1202 19:59:39.750602 4807 scope.go:117] "RemoveContainer" containerID="c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d63" Dec 02 19:59:39 crc kubenswrapper[4807]: E1202 19:59:39.750782 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-f5x8r_openshift-multus(8a909a25-5ede-458e-af78-4a41b79716a5)\"" pod="openshift-multus/multus-f5x8r" podUID="8a909a25-5ede-458e-af78-4a41b79716a5" Dec 02 19:59:39 crc kubenswrapper[4807]: I1202 19:59:39.972427 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:39 crc kubenswrapper[4807]: I1202 19:59:39.972427 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:39 crc kubenswrapper[4807]: E1202 19:59:39.973048 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:39 crc kubenswrapper[4807]: I1202 19:59:39.972668 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:39 crc kubenswrapper[4807]: I1202 19:59:39.972455 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:39 crc kubenswrapper[4807]: E1202 19:59:39.973326 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:39 crc kubenswrapper[4807]: E1202 19:59:39.973392 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:39 crc kubenswrapper[4807]: E1202 19:59:39.973507 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:40 crc kubenswrapper[4807]: I1202 19:59:40.757238 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f5x8r_8a909a25-5ede-458e-af78-4a41b79716a5/kube-multus/1.log" Dec 02 19:59:41 crc kubenswrapper[4807]: I1202 19:59:41.971652 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:41 crc kubenswrapper[4807]: E1202 19:59:41.971893 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:41 crc kubenswrapper[4807]: I1202 19:59:41.971980 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:41 crc kubenswrapper[4807]: E1202 19:59:41.972079 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:41 crc kubenswrapper[4807]: I1202 19:59:41.972261 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:41 crc kubenswrapper[4807]: I1202 19:59:41.972331 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:41 crc kubenswrapper[4807]: E1202 19:59:41.972542 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:41 crc kubenswrapper[4807]: E1202 19:59:41.972678 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:41 crc kubenswrapper[4807]: I1202 19:59:41.973553 4807 scope.go:117] "RemoveContainer" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 19:59:41 crc kubenswrapper[4807]: E1202 19:59:41.973768 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5plsn_openshift-ovn-kubernetes(798a6158-a963-43b4-941e-ac4f3df2f883)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" Dec 02 19:59:43 crc kubenswrapper[4807]: I1202 19:59:43.971644 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:43 crc kubenswrapper[4807]: I1202 19:59:43.971645 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:43 crc kubenswrapper[4807]: I1202 19:59:43.971802 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:43 crc kubenswrapper[4807]: I1202 19:59:43.971809 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:43 crc kubenswrapper[4807]: E1202 19:59:43.971984 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:43 crc kubenswrapper[4807]: E1202 19:59:43.972133 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:43 crc kubenswrapper[4807]: E1202 19:59:43.972380 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:43 crc kubenswrapper[4807]: E1202 19:59:43.972522 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:45 crc kubenswrapper[4807]: E1202 19:59:45.014895 4807 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 02 19:59:45 crc kubenswrapper[4807]: E1202 19:59:45.166491 4807 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 19:59:45 crc kubenswrapper[4807]: I1202 19:59:45.972517 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:45 crc kubenswrapper[4807]: I1202 19:59:45.972541 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:45 crc kubenswrapper[4807]: I1202 19:59:45.972580 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:45 crc kubenswrapper[4807]: I1202 19:59:45.972747 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:45 crc kubenswrapper[4807]: E1202 19:59:45.972789 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:45 crc kubenswrapper[4807]: E1202 19:59:45.972962 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:45 crc kubenswrapper[4807]: E1202 19:59:45.973146 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:45 crc kubenswrapper[4807]: E1202 19:59:45.973245 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:47 crc kubenswrapper[4807]: I1202 19:59:47.972027 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:47 crc kubenswrapper[4807]: I1202 19:59:47.972138 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:47 crc kubenswrapper[4807]: E1202 19:59:47.972202 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:47 crc kubenswrapper[4807]: I1202 19:59:47.972138 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:47 crc kubenswrapper[4807]: I1202 19:59:47.972362 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:47 crc kubenswrapper[4807]: E1202 19:59:47.972375 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:47 crc kubenswrapper[4807]: E1202 19:59:47.972500 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:47 crc kubenswrapper[4807]: E1202 19:59:47.972571 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:49 crc kubenswrapper[4807]: I1202 19:59:49.971948 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:49 crc kubenswrapper[4807]: E1202 19:59:49.972141 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:49 crc kubenswrapper[4807]: I1202 19:59:49.972214 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:49 crc kubenswrapper[4807]: I1202 19:59:49.972309 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:49 crc kubenswrapper[4807]: E1202 19:59:49.972434 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:49 crc kubenswrapper[4807]: E1202 19:59:49.972596 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:49 crc kubenswrapper[4807]: I1202 19:59:49.972762 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:49 crc kubenswrapper[4807]: E1202 19:59:49.972884 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:50 crc kubenswrapper[4807]: E1202 19:59:50.168294 4807 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 19:59:51 crc kubenswrapper[4807]: I1202 19:59:51.972016 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:51 crc kubenswrapper[4807]: E1202 19:59:51.972422 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:51 crc kubenswrapper[4807]: I1202 19:59:51.972601 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:51 crc kubenswrapper[4807]: I1202 19:59:51.972630 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:51 crc kubenswrapper[4807]: I1202 19:59:51.972711 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:51 crc kubenswrapper[4807]: E1202 19:59:51.972867 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:51 crc kubenswrapper[4807]: E1202 19:59:51.973043 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:51 crc kubenswrapper[4807]: E1202 19:59:51.973103 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:53 crc kubenswrapper[4807]: I1202 19:59:53.971906 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:53 crc kubenswrapper[4807]: I1202 19:59:53.972009 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:53 crc kubenswrapper[4807]: I1202 19:59:53.972191 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:53 crc kubenswrapper[4807]: I1202 19:59:53.972523 4807 scope.go:117] "RemoveContainer" containerID="c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d63" Dec 02 19:59:53 crc kubenswrapper[4807]: E1202 19:59:53.972532 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:53 crc kubenswrapper[4807]: I1202 19:59:53.972612 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:53 crc kubenswrapper[4807]: E1202 19:59:53.972703 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:53 crc kubenswrapper[4807]: E1202 19:59:53.972915 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:53 crc kubenswrapper[4807]: E1202 19:59:53.972963 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:54 crc kubenswrapper[4807]: I1202 19:59:54.814277 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f5x8r_8a909a25-5ede-458e-af78-4a41b79716a5/kube-multus/1.log" Dec 02 19:59:54 crc kubenswrapper[4807]: I1202 19:59:54.814344 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f5x8r" event={"ID":"8a909a25-5ede-458e-af78-4a41b79716a5","Type":"ContainerStarted","Data":"36732a036554b4a2b688a0b85aa52399128710e2139572d5ad3f190bb95b4a72"} Dec 02 19:59:55 crc kubenswrapper[4807]: E1202 19:59:55.168870 4807 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 19:59:55 crc kubenswrapper[4807]: I1202 19:59:55.971754 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:55 crc kubenswrapper[4807]: I1202 19:59:55.971905 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:55 crc kubenswrapper[4807]: I1202 19:59:55.971801 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:55 crc kubenswrapper[4807]: I1202 19:59:55.971771 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:55 crc kubenswrapper[4807]: E1202 19:59:55.971969 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:55 crc kubenswrapper[4807]: E1202 19:59:55.972140 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:55 crc kubenswrapper[4807]: E1202 19:59:55.972196 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:55 crc kubenswrapper[4807]: E1202 19:59:55.972265 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:55 crc kubenswrapper[4807]: I1202 19:59:55.973397 4807 scope.go:117] "RemoveContainer" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 19:59:56 crc kubenswrapper[4807]: I1202 19:59:56.823820 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/3.log" Dec 02 19:59:56 crc kubenswrapper[4807]: I1202 19:59:56.828580 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerStarted","Data":"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2"} Dec 02 19:59:56 crc kubenswrapper[4807]: I1202 19:59:56.829177 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 19:59:56 crc kubenswrapper[4807]: I1202 19:59:56.841119 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7z9t6"] Dec 02 19:59:56 crc kubenswrapper[4807]: I1202 19:59:56.841240 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:56 crc kubenswrapper[4807]: E1202 19:59:56.841342 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:56 crc kubenswrapper[4807]: I1202 19:59:56.872994 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podStartSLOduration=112.872973309 podStartE2EDuration="1m52.872973309s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 19:59:56.872084148 +0000 UTC m=+132.172991673" watchObservedRunningTime="2025-12-02 19:59:56.872973309 +0000 UTC m=+132.173880834" Dec 02 19:59:57 crc kubenswrapper[4807]: I1202 19:59:57.972279 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:57 crc kubenswrapper[4807]: I1202 19:59:57.972293 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:57 crc kubenswrapper[4807]: E1202 19:59:57.972919 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:57 crc kubenswrapper[4807]: I1202 19:59:57.972428 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:57 crc kubenswrapper[4807]: I1202 19:59:57.972348 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:57 crc kubenswrapper[4807]: E1202 19:59:57.973068 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:57 crc kubenswrapper[4807]: E1202 19:59:57.973156 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 19:59:57 crc kubenswrapper[4807]: E1202 19:59:57.973245 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:59 crc kubenswrapper[4807]: I1202 19:59:59.971698 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 19:59:59 crc kubenswrapper[4807]: I1202 19:59:59.971816 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 19:59:59 crc kubenswrapper[4807]: E1202 19:59:59.971895 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7z9t6" podUID="1cb49a08-30b0-4353-ad4a-23362f281475" Dec 02 19:59:59 crc kubenswrapper[4807]: I1202 19:59:59.971964 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 19:59:59 crc kubenswrapper[4807]: E1202 19:59:59.972018 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 19:59:59 crc kubenswrapper[4807]: I1202 19:59:59.972058 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 19:59:59 crc kubenswrapper[4807]: E1202 19:59:59.972162 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 19:59:59 crc kubenswrapper[4807]: E1202 19:59:59.972215 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.060904 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.094779 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kdp2m"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.095233 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.096900 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-p49z8"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.097753 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.098154 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.099144 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.099625 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.099787 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.099792 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.099791 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.125079 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.126952 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.127547 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.128385 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.128486 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.129698 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.130089 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.130180 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.130256 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.130496 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.130620 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.130937 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.131195 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.131943 4807 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-frmzm"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.132115 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.132460 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-frmzm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.132904 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.133541 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.134087 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.134188 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.134277 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.134522 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.136739 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.137316 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.137785 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.138072 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.139554 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.140249 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7727z"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.140459 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.140747 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.141366 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7c8mj"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.142007 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.142452 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.143189 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.162134 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.163986 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-config\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164046 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-config\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164080 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-audit\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164097 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-trusted-ca-bundle\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164120 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-image-import-ca\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164144 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-client-ca\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164168 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af5c02f0-23a0-45e5-80ae-3510d6d908dc-serving-cert\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" 
Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164185 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-etcd-serving-ca\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164201 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0dd45ac-1b9d-4274-96c1-98290213c989-serving-cert\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164218 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b0dd45ac-1b9d-4274-96c1-98290213c989-node-pullsecrets\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164235 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b0dd45ac-1b9d-4274-96c1-98290213c989-encryption-config\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164252 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz285\" (UniqueName: \"kubernetes.io/projected/af5c02f0-23a0-45e5-80ae-3510d6d908dc-kube-api-access-bz285\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: 
\"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164269 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b0dd45ac-1b9d-4274-96c1-98290213c989-etcd-client\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164283 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164310 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b0dd45ac-1b9d-4274-96c1-98290213c989-audit-dir\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164334 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25cgr\" (UniqueName: \"kubernetes.io/projected/b0dd45ac-1b9d-4274-96c1-98290213c989-kube-api-access-25cgr\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164973 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 
02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.165021 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.164982 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.165166 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.165184 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.165222 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.165636 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.165843 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.168296 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-b46dw"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.168814 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.170815 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.171044 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.171215 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.171569 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.171751 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.171909 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.174338 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.175191 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.175875 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.176061 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.176088 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.176192 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.176328 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.176500 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.176622 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.176760 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.176624 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.176763 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.176910 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.176870 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.179201 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.180247 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.180852 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.180955 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.180864 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.181042 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.181077 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.181118 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.181131 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.181507 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.181625 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.181662 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.181770 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.183952 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.184345 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.184482 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.184660 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.184755 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.184808 4807 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.184865 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.184954 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185210 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185391 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185431 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185455 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185496 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185578 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185619 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185629 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185702 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185711 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185591 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185797 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185712 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.185989 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.186014 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.186116 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.186118 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.187305 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-bjmsc"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.187857 4807 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.188804 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2c9sc"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.189301 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.189452 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.189571 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.190082 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.192166 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.192675 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.202257 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.202965 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.203018 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.203829 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.203950 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.203968 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.204062 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.204176 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.204263 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.204407 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 20:00:01 crc 
kubenswrapper[4807]: I1202 20:00:01.204423 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.204515 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.204560 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.205310 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.204063 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.206007 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.207243 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.215327 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.237090 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.237623 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-69994"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.239844 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.240031 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.250045 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.250057 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.250242 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.250478 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.251460 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.251998 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.253544 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.254253 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.254392 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-2kvcm"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.255016 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.258680 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.259052 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.261329 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265132 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbdcz\" (UniqueName: \"kubernetes.io/projected/744130db-de38-4c78-9684-95c04e411397-kube-api-access-xbdcz\") pod \"machine-config-controller-84d6567774-ssjb6\" (UID: \"744130db-de38-4c78-9684-95c04e411397\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265175 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxw7p\" (UniqueName: \"kubernetes.io/projected/3ef6cafd-6676-4a26-9b8f-96317dda91bf-kube-api-access-wxw7p\") pod \"package-server-manager-789f6589d5-cfdkz\" (UID: \"3ef6cafd-6676-4a26-9b8f-96317dda91bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265203 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e488a159-e6fd-4ede-af64-43b8b1f31e4f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-qc728\" (UID: \"e488a159-e6fd-4ede-af64-43b8b1f31e4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265231 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-audit\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265254 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-trusted-ca-bundle\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265272 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2c9sc\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265296 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb4cr\" (UniqueName: \"kubernetes.io/projected/631034d3-e8b2-42a8-9566-8f8922464b56-kube-api-access-zb4cr\") pod \"cluster-samples-operator-665b6dd947-49lk4\" (UID: \"631034d3-e8b2-42a8-9566-8f8922464b56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265314 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-config\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265331 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/580f1600-09f3-46f0-8429-1846d2f46ba7-srv-cert\") pod \"olm-operator-6b444d44fb-zm2nv\" (UID: 
\"580f1600-09f3-46f0-8429-1846d2f46ba7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265350 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-886d9\" (UniqueName: \"kubernetes.io/projected/b5784f10-10dd-4363-a50a-60d37b9c9ec5-kube-api-access-886d9\") pod \"downloads-7954f5f757-frmzm\" (UID: \"b5784f10-10dd-4363-a50a-60d37b9c9ec5\") " pod="openshift-console/downloads-7954f5f757-frmzm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265365 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85db3d77-931a-4b82-90e7-acb77f874edc-auth-proxy-config\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265383 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-serving-cert\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265400 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-config\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265416 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265433 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt7k7\" (UniqueName: \"kubernetes.io/projected/f69009ba-27a8-4518-b6d9-97194d8a77b3-kube-api-access-gt7k7\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265451 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a670da2-9970-471b-93fb-7dce5fe66c94-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cmwzb\" (UID: \"5a670da2-9970-471b-93fb-7dce5fe66c94\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265470 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0622ab0a-2b5b-4f71-a9b1-8573297d6351-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x9bfb\" (UID: \"0622ab0a-2b5b-4f71-a9b1-8573297d6351\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265484 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0622ab0a-2b5b-4f71-a9b1-8573297d6351-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x9bfb\" (UID: 
\"0622ab0a-2b5b-4f71-a9b1-8573297d6351\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265497 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-oauth-serving-cert\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265516 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-image-import-ca\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265541 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-audit-policies\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265557 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj2lc\" (UniqueName: \"kubernetes.io/projected/85db3d77-931a-4b82-90e7-acb77f874edc-kube-api-access-bj2lc\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265577 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-client-ca\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265593 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvp6f\" (UniqueName: \"kubernetes.io/projected/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-kube-api-access-kvp6f\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265617 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-oauth-config\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265640 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af5c02f0-23a0-45e5-80ae-3510d6d908dc-serving-cert\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265657 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-config\") pod \"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: 
I1202 20:00:01.265672 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kmk94\" (UID: \"bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265690 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-etcd-serving-ca\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265707 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0dd45ac-1b9d-4274-96c1-98290213c989-serving-cert\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265758 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265775 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwq4h\" (UniqueName: \"kubernetes.io/projected/96d50594-2379-4758-9882-7328d7bdf1fb-kube-api-access-bwq4h\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-tt5pr\" (UID: \"96d50594-2379-4758-9882-7328d7bdf1fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265795 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f69009ba-27a8-4518-b6d9-97194d8a77b3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265821 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f69009ba-27a8-4518-b6d9-97194d8a77b3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265838 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqscq\" (UniqueName: \"kubernetes.io/projected/e488a159-e6fd-4ede-af64-43b8b1f31e4f-kube-api-access-kqscq\") pod \"openshift-apiserver-operator-796bbdcf4f-qc728\" (UID: \"e488a159-e6fd-4ede-af64-43b8b1f31e4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265851 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-serving-cert\") pod \"authentication-operator-69f744f599-7727z\" (UID: 
\"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265869 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/85db3d77-931a-4b82-90e7-acb77f874edc-machine-approver-tls\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265886 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljw4w\" (UniqueName: \"kubernetes.io/projected/10c98701-5de0-4c9b-a109-01a2985dc868-kube-api-access-ljw4w\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265903 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kmk94\" (UID: \"bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265920 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b0dd45ac-1b9d-4274-96c1-98290213c989-node-pullsecrets\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265939 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz285\" (UniqueName: \"kubernetes.io/projected/af5c02f0-23a0-45e5-80ae-3510d6d908dc-kube-api-access-bz285\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265958 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-etcd-client\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265973 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/744130db-de38-4c78-9684-95c04e411397-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ssjb6\" (UID: \"744130db-de38-4c78-9684-95c04e411397\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.265988 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-service-ca-bundle\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266004 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2c9sc\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266021 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b0dd45ac-1b9d-4274-96c1-98290213c989-encryption-config\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266038 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-images\") pod \"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266052 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-trusted-ca-bundle\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266069 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-service-ca\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266085 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f69009ba-27a8-4518-b6d9-97194d8a77b3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266104 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266122 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dqrz\" (UniqueName: \"kubernetes.io/projected/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-kube-api-access-8dqrz\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266143 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/744130db-de38-4c78-9684-95c04e411397-proxy-tls\") pod \"machine-config-controller-84d6567774-ssjb6\" (UID: \"744130db-de38-4c78-9684-95c04e411397\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266160 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266178 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-trusted-ca\") pod \"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266195 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b0dd45ac-1b9d-4274-96c1-98290213c989-etcd-client\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266215 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqbz2\" (UniqueName: \"kubernetes.io/projected/350fb173-d354-4879-9e3b-c2429a682c05-kube-api-access-qqbz2\") pod \"catalog-operator-68c6474976-g89wb\" (UID: \"350fb173-d354-4879-9e3b-c2429a682c05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266231 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ef6cafd-6676-4a26-9b8f-96317dda91bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cfdkz\" (UID: \"3ef6cafd-6676-4a26-9b8f-96317dda91bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266251 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b0dd45ac-1b9d-4274-96c1-98290213c989-audit-dir\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266269 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266286 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-config\") pod \"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266302 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgbk9\" (UniqueName: \"kubernetes.io/projected/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-kube-api-access-dgbk9\") pod \"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266318 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d50594-2379-4758-9882-7328d7bdf1fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tt5pr\" (UID: \"96d50594-2379-4758-9882-7328d7bdf1fb\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266332 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-serving-cert\") pod \"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266353 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcwpx\" (UniqueName: \"kubernetes.io/projected/617adf8b-8ca9-4578-8c24-8f6b22713567-kube-api-access-qcwpx\") pod \"control-plane-machine-set-operator-78cbb6b69f-n7sh5\" (UID: \"617adf8b-8ca9-4578-8c24-8f6b22713567\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266375 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25cgr\" (UniqueName: \"kubernetes.io/projected/b0dd45ac-1b9d-4274-96c1-98290213c989-kube-api-access-25cgr\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266392 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-serving-cert\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266408 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-client-ca\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266427 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0622ab0a-2b5b-4f71-a9b1-8573297d6351-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x9bfb\" (UID: \"0622ab0a-2b5b-4f71-a9b1-8573297d6351\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266444 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/580f1600-09f3-46f0-8429-1846d2f46ba7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zm2nv\" (UID: \"580f1600-09f3-46f0-8429-1846d2f46ba7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266461 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-config\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266476 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10c98701-5de0-4c9b-a109-01a2985dc868-serving-cert\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266493 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp587\" (UniqueName: \"kubernetes.io/projected/5a670da2-9970-471b-93fb-7dce5fe66c94-kube-api-access-cp587\") pod \"openshift-config-operator-7777fb866f-cmwzb\" (UID: \"5a670da2-9970-471b-93fb-7dce5fe66c94\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266510 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5-config\") pod \"kube-controller-manager-operator-78b949d7b-kmk94\" (UID: \"bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266530 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e488a159-e6fd-4ede-af64-43b8b1f31e4f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-qc728\" (UID: \"e488a159-e6fd-4ede-af64-43b8b1f31e4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266555 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d50594-2379-4758-9882-7328d7bdf1fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tt5pr\" (UID: \"96d50594-2379-4758-9882-7328d7bdf1fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" Dec 02 20:00:01 crc 
kubenswrapper[4807]: I1202 20:00:01.266573 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a670da2-9970-471b-93fb-7dce5fe66c94-serving-cert\") pod \"openshift-config-operator-7777fb866f-cmwzb\" (UID: \"5a670da2-9970-471b-93fb-7dce5fe66c94\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266591 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smj9s\" (UniqueName: \"kubernetes.io/projected/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-kube-api-access-smj9s\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266614 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-config\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266673 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85db3d77-931a-4b82-90e7-acb77f874edc-config\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266693 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mljmt\" (UniqueName: \"kubernetes.io/projected/580f1600-09f3-46f0-8429-1846d2f46ba7-kube-api-access-mljmt\") pod \"olm-operator-6b444d44fb-zm2nv\" (UID: 
\"580f1600-09f3-46f0-8429-1846d2f46ba7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266732 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8thbr\" (UniqueName: \"kubernetes.io/projected/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-kube-api-access-8thbr\") pod \"marketplace-operator-79b997595-2c9sc\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266756 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw5hl\" (UniqueName: \"kubernetes.io/projected/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-kube-api-access-dw5hl\") pod \"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266776 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-config\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266796 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/350fb173-d354-4879-9e3b-c2429a682c05-profile-collector-cert\") pod \"catalog-operator-68c6474976-g89wb\" (UID: \"350fb173-d354-4879-9e3b-c2429a682c05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266806 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-trusted-ca-bundle\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.266814 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/617adf8b-8ca9-4578-8c24-8f6b22713567-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-n7sh5\" (UID: \"617adf8b-8ca9-4578-8c24-8f6b22713567\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.267390 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-encryption-config\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.267417 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-audit-dir\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.267436 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/350fb173-d354-4879-9e3b-c2429a682c05-srv-cert\") pod \"catalog-operator-68c6474976-g89wb\" (UID: 
\"350fb173-d354-4879-9e3b-c2429a682c05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.267453 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/631034d3-e8b2-42a8-9566-8f8922464b56-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-49lk4\" (UID: \"631034d3-e8b2-42a8-9566-8f8922464b56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.267999 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-audit\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.268081 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pzlxq"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.268788 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-image-import-ca\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.268878 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.268937 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-client-ca\") pod \"controller-manager-879f6c89f-kdp2m\" 
(UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.269313 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v4lp2"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.269750 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.270395 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.270551 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.271884 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b0dd45ac-1b9d-4274-96c1-98290213c989-audit-dir\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.272077 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b0dd45ac-1b9d-4274-96c1-98290213c989-node-pullsecrets\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.272685 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-etcd-serving-ca\") pod \"apiserver-76f77b778f-p49z8\" (UID: 
\"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.272795 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0dd45ac-1b9d-4274-96c1-98290213c989-config\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.273406 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-config\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.274238 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.274550 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cbpn8"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.275324 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.276974 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.277107 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af5c02f0-23a0-45e5-80ae-3510d6d908dc-serving-cert\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.277648 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.277835 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.278250 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.278634 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.279453 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-n9g5h"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.280079 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.280300 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b0dd45ac-1b9d-4274-96c1-98290213c989-etcd-client\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.281172 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xgm55"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.281643 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0dd45ac-1b9d-4274-96c1-98290213c989-serving-cert\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.282027 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.282364 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jh4tx"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.283221 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.283631 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.284406 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.284788 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b0dd45ac-1b9d-4274-96c1-98290213c989-encryption-config\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.284926 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c4hb8"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.286186 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.286358 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.287857 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kdp2m"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.289158 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-p49z8"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.290265 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.291618 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.293209 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.294286 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.295546 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-frmzm"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.296774 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.298027 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.298877 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.304369 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bjmsc"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.306264 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.312334 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7727z"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.313685 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.314766 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.316066 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-69994"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.316947 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v4lp2"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.317793 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.318036 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.319090 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.320198 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-n9g5h"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.321141 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-v476v"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.322145 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v476v" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.322265 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.323319 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.324394 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.325486 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7c8mj"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.326678 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.327967 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jh4tx"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.331421 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.333439 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2c9sc"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.340362 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.340415 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xgm55"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.341488 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-b8g8z"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.342510 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dsdtv"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.342775 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.343042 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dsdtv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.343657 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cbpn8"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.345595 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-b46dw"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.347607 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pzlxq"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.349815 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.349915 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v476v"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.350869 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c4hb8"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.351948 4807 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.353031 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-b8g8z"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.354152 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6j8qw"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.355710 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.356984 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6j8qw"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.357183 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.368667 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/580f1600-09f3-46f0-8429-1846d2f46ba7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zm2nv\" (UID: \"580f1600-09f3-46f0-8429-1846d2f46ba7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.368701 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0622ab0a-2b5b-4f71-a9b1-8573297d6351-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x9bfb\" (UID: \"0622ab0a-2b5b-4f71-a9b1-8573297d6351\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.368749 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-48zzt\" (UniqueName: \"kubernetes.io/projected/f1b36133-0f4e-49bf-b33d-224112b2a964-kube-api-access-48zzt\") pod \"migrator-59844c95c7-lmjwv\" (UID: \"f1b36133-0f4e-49bf-b33d-224112b2a964\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.368781 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g8x5\" (UniqueName: \"kubernetes.io/projected/63a3212f-beb2-4bc0-b88a-947b3a7653c2-kube-api-access-2g8x5\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.368813 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-dir\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.368839 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp587\" (UniqueName: \"kubernetes.io/projected/5a670da2-9970-471b-93fb-7dce5fe66c94-kube-api-access-cp587\") pod \"openshift-config-operator-7777fb866f-cmwzb\" (UID: \"5a670da2-9970-471b-93fb-7dce5fe66c94\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.368873 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5-config\") pod \"kube-controller-manager-operator-78b949d7b-kmk94\" (UID: \"bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.368898 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.368935 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zqh9\" (UniqueName: \"kubernetes.io/projected/3a3480f6-d49a-4f37-9f9a-605d9efc851e-kube-api-access-8zqh9\") pod \"multus-admission-controller-857f4d67dd-c4hb8\" (UID: \"3a3480f6-d49a-4f37-9f9a-605d9efc851e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.368961 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369002 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d50594-2379-4758-9882-7328d7bdf1fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tt5pr\" (UID: \"96d50594-2379-4758-9882-7328d7bdf1fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 
20:00:01.369037 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0555dc0e-5155-49f2-8c17-cfd091afacf9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369079 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/623a789c-ee08-43de-b81c-8f1499ab8fbc-config\") pod \"kube-apiserver-operator-766d6c64bb-89tmm\" (UID: \"623a789c-ee08-43de-b81c-8f1499ab8fbc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369129 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85db3d77-931a-4b82-90e7-acb77f874edc-config\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369175 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mljmt\" (UniqueName: \"kubernetes.io/projected/580f1600-09f3-46f0-8429-1846d2f46ba7-kube-api-access-mljmt\") pod \"olm-operator-6b444d44fb-zm2nv\" (UID: \"580f1600-09f3-46f0-8429-1846d2f46ba7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369196 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/350fb173-d354-4879-9e3b-c2429a682c05-profile-collector-cert\") pod \"catalog-operator-68c6474976-g89wb\" (UID: 
\"350fb173-d354-4879-9e3b-c2429a682c05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369220 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/617adf8b-8ca9-4578-8c24-8f6b22713567-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-n7sh5\" (UID: \"617adf8b-8ca9-4578-8c24-8f6b22713567\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369250 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8d9r\" (UniqueName: \"kubernetes.io/projected/3d3a8832-4745-4b77-b855-edea43d079d4-kube-api-access-q8d9r\") pod \"dns-operator-744455d44c-cbpn8\" (UID: \"3d3a8832-4745-4b77-b855-edea43d079d4\") " pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369275 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/350fb173-d354-4879-9e3b-c2429a682c05-srv-cert\") pod \"catalog-operator-68c6474976-g89wb\" (UID: \"350fb173-d354-4879-9e3b-c2429a682c05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369305 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/631034d3-e8b2-42a8-9566-8f8922464b56-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-49lk4\" (UID: \"631034d3-e8b2-42a8-9566-8f8922464b56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369337 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a3480f6-d49a-4f37-9f9a-605d9efc851e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c4hb8\" (UID: \"3a3480f6-d49a-4f37-9f9a-605d9efc851e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369387 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxw7p\" (UniqueName: \"kubernetes.io/projected/3ef6cafd-6676-4a26-9b8f-96317dda91bf-kube-api-access-wxw7p\") pod \"package-server-manager-789f6589d5-cfdkz\" (UID: \"3ef6cafd-6676-4a26-9b8f-96317dda91bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369422 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/947012be-68b2-4b9a-b020-258b9b1f0ce8-proxy-tls\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369451 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55c432b0-edb0-4988-994e-0a888a54621a-serving-cert\") pod \"service-ca-operator-777779d784-xgm55\" (UID: \"55c432b0-edb0-4988-994e-0a888a54621a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369476 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae77852d-9558-4ceb-9eef-1b65bb912a92-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-k6bt9\" (UID: \"ae77852d-9558-4ceb-9eef-1b65bb912a92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369504 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-webhook-cert\") pod \"packageserver-d55dfcdfc-8sxvh\" (UID: \"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369534 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbdcz\" (UniqueName: \"kubernetes.io/projected/744130db-de38-4c78-9684-95c04e411397-kube-api-access-xbdcz\") pod \"machine-config-controller-84d6567774-ssjb6\" (UID: \"744130db-de38-4c78-9684-95c04e411397\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369559 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb4cr\" (UniqueName: \"kubernetes.io/projected/631034d3-e8b2-42a8-9566-8f8922464b56-kube-api-access-zb4cr\") pod \"cluster-samples-operator-665b6dd947-49lk4\" (UID: \"631034d3-e8b2-42a8-9566-8f8922464b56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369587 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c3b0b579-6db1-463a-b391-6799f284b89a-default-certificate\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc 
kubenswrapper[4807]: I1202 20:00:01.369621 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-config\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369650 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-886d9\" (UniqueName: \"kubernetes.io/projected/b5784f10-10dd-4363-a50a-60d37b9c9ec5-kube-api-access-886d9\") pod \"downloads-7954f5f757-frmzm\" (UID: \"b5784f10-10dd-4363-a50a-60d37b9c9ec5\") " pod="openshift-console/downloads-7954f5f757-frmzm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369677 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-serving-cert\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369701 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-config\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369762 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2v7j\" (UniqueName: \"kubernetes.io/projected/ae77852d-9558-4ceb-9eef-1b65bb912a92-kube-api-access-p2v7j\") pod \"openshift-controller-manager-operator-756b6f6bc6-k6bt9\" (UID: \"ae77852d-9558-4ceb-9eef-1b65bb912a92\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369795 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt7k7\" (UniqueName: \"kubernetes.io/projected/f69009ba-27a8-4518-b6d9-97194d8a77b3-kube-api-access-gt7k7\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369821 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369832 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0622ab0a-2b5b-4f71-a9b1-8573297d6351-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x9bfb\" (UID: \"0622ab0a-2b5b-4f71-a9b1-8573297d6351\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369850 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-oauth-serving-cert\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369874 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5-config\") pod \"kube-controller-manager-operator-78b949d7b-kmk94\" (UID: \"bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369896 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-audit-policies\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.369959 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj2lc\" (UniqueName: \"kubernetes.io/projected/85db3d77-931a-4b82-90e7-acb77f874edc-kube-api-access-bj2lc\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370001 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-ca\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370027 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-oauth-config\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370065 
4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-config\") pod \"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370085 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370111 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-policies\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370108 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85db3d77-931a-4b82-90e7-acb77f874edc-config\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370162 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370190 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f69009ba-27a8-4518-b6d9-97194d8a77b3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370209 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqscq\" (UniqueName: \"kubernetes.io/projected/e488a159-e6fd-4ede-af64-43b8b1f31e4f-kube-api-access-kqscq\") pod \"openshift-apiserver-operator-796bbdcf4f-qc728\" (UID: \"e488a159-e6fd-4ede-af64-43b8b1f31e4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370228 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kmk94\" (UID: \"bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370249 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdwlh\" (UniqueName: \"kubernetes.io/projected/947012be-68b2-4b9a-b020-258b9b1f0ce8-kube-api-access-wdwlh\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:01 crc 
kubenswrapper[4807]: I1202 20:00:01.370276 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-serving-cert\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370296 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2c9sc\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370317 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370343 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-service-ca-bundle\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370367 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-service-ca\") pod 
\"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370388 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370574 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqbz2\" (UniqueName: \"kubernetes.io/projected/350fb173-d354-4879-9e3b-c2429a682c05-kube-api-access-qqbz2\") pod \"catalog-operator-68c6474976-g89wb\" (UID: \"350fb173-d354-4879-9e3b-c2429a682c05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.370663 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-oauth-serving-cert\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.371704 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-audit-policies\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.371796 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-config\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.372085 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-service-ca-bundle\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.372166 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-config\") pod \"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.372200 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ef6cafd-6676-4a26-9b8f-96317dda91bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cfdkz\" (UID: \"3ef6cafd-6676-4a26-9b8f-96317dda91bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.372228 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/947012be-68b2-4b9a-b020-258b9b1f0ce8-images\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:01 crc kubenswrapper[4807]: 
I1202 20:00:01.372354 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxg5r\" (UniqueName: \"kubernetes.io/projected/55c432b0-edb0-4988-994e-0a888a54621a-kube-api-access-cxg5r\") pod \"service-ca-operator-777779d784-xgm55\" (UID: \"55c432b0-edb0-4988-994e-0a888a54621a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.372400 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dqrz\" (UniqueName: \"kubernetes.io/projected/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-kube-api-access-8dqrz\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.372564 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.372613 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/30a68e73-24db-4b31-a2d3-7d25eb88f0ff-signing-key\") pod \"service-ca-9c57cc56f-pzlxq\" (UID: \"30a68e73-24db-4b31-a2d3-7d25eb88f0ff\") " pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.372690 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d50594-2379-4758-9882-7328d7bdf1fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tt5pr\" (UID: \"96d50594-2379-4758-9882-7328d7bdf1fb\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.372756 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcwpx\" (UniqueName: \"kubernetes.io/projected/617adf8b-8ca9-4578-8c24-8f6b22713567-kube-api-access-qcwpx\") pod \"control-plane-machine-set-operator-78cbb6b69f-n7sh5\" (UID: \"617adf8b-8ca9-4578-8c24-8f6b22713567\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.372982 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.373032 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-config\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.373085 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-serving-cert\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.373140 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-client-ca\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.373278 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10c98701-5de0-4c9b-a109-01a2985dc868-serving-cert\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.373294 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/617adf8b-8ca9-4578-8c24-8f6b22713567-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-n7sh5\" (UID: \"617adf8b-8ca9-4578-8c24-8f6b22713567\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.373587 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96d50594-2379-4758-9882-7328d7bdf1fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tt5pr\" (UID: \"96d50594-2379-4758-9882-7328d7bdf1fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.373765 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v4lp2\" 
(UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.373797 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e488a159-e6fd-4ede-af64-43b8b1f31e4f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-qc728\" (UID: \"e488a159-e6fd-4ede-af64-43b8b1f31e4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.373861 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a670da2-9970-471b-93fb-7dce5fe66c94-serving-cert\") pod \"openshift-config-operator-7777fb866f-cmwzb\" (UID: \"5a670da2-9970-471b-93fb-7dce5fe66c94\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.373892 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smj9s\" (UniqueName: \"kubernetes.io/projected/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-kube-api-access-smj9s\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.373985 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96d50594-2379-4758-9882-7328d7bdf1fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tt5pr\" (UID: \"96d50594-2379-4758-9882-7328d7bdf1fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374091 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/580f1600-09f3-46f0-8429-1846d2f46ba7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zm2nv\" (UID: \"580f1600-09f3-46f0-8429-1846d2f46ba7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374183 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-client-ca\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374249 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3b0b579-6db1-463a-b391-6799f284b89a-metrics-certs\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374280 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-apiservice-cert\") pod \"packageserver-d55dfcdfc-8sxvh\" (UID: \"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374326 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8thbr\" (UniqueName: \"kubernetes.io/projected/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-kube-api-access-8thbr\") pod \"marketplace-operator-79b997595-2c9sc\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 
20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374362 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c3b0b579-6db1-463a-b391-6799f284b89a-stats-auth\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374401 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw5hl\" (UniqueName: \"kubernetes.io/projected/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-kube-api-access-dw5hl\") pod \"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374424 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-config\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374441 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0555dc0e-5155-49f2-8c17-cfd091afacf9-metrics-tls\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374460 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-encryption-config\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: 
\"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374477 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-audit-dir\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374500 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-client\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374519 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwfhg\" (UniqueName: \"kubernetes.io/projected/0555dc0e-5155-49f2-8c17-cfd091afacf9-kube-api-access-qwfhg\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374538 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374556 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/623a789c-ee08-43de-b81c-8f1499ab8fbc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-89tmm\" (UID: \"623a789c-ee08-43de-b81c-8f1499ab8fbc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374579 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e488a159-e6fd-4ede-af64-43b8b1f31e4f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-qc728\" (UID: \"e488a159-e6fd-4ede-af64-43b8b1f31e4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374603 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/623a789c-ee08-43de-b81c-8f1499ab8fbc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-89tmm\" (UID: \"623a789c-ee08-43de-b81c-8f1499ab8fbc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374704 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-audit-dir\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.374803 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2c9sc\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:01 crc 
kubenswrapper[4807]: I1202 20:00:01.374840 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d3a8832-4745-4b77-b855-edea43d079d4-metrics-tls\") pod \"dns-operator-744455d44c-cbpn8\" (UID: \"3d3a8832-4745-4b77-b855-edea43d079d4\") " pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375043 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/580f1600-09f3-46f0-8429-1846d2f46ba7-srv-cert\") pod \"olm-operator-6b444d44fb-zm2nv\" (UID: \"580f1600-09f3-46f0-8429-1846d2f46ba7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375088 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc5kl\" (UniqueName: \"kubernetes.io/projected/30d55e05-d66f-496a-b4e1-fb6deb38895f-kube-api-access-wc5kl\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375150 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85db3d77-931a-4b82-90e7-acb77f874edc-auth-proxy-config\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375199 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375316 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375360 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a670da2-9970-471b-93fb-7dce5fe66c94-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cmwzb\" (UID: \"5a670da2-9970-471b-93fb-7dce5fe66c94\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375388 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0622ab0a-2b5b-4f71-a9b1-8573297d6351-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x9bfb\" (UID: \"0622ab0a-2b5b-4f71-a9b1-8573297d6351\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375448 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/30a68e73-24db-4b31-a2d3-7d25eb88f0ff-signing-cabundle\") pod \"service-ca-9c57cc56f-pzlxq\" (UID: \"30a68e73-24db-4b31-a2d3-7d25eb88f0ff\") " pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375472 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0622ab0a-2b5b-4f71-a9b1-8573297d6351-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x9bfb\" (UID: \"0622ab0a-2b5b-4f71-a9b1-8573297d6351\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375496 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-serving-cert\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375520 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-service-ca\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375566 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-tmpfs\") pod \"packageserver-d55dfcdfc-8sxvh\" (UID: \"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375587 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3b0b579-6db1-463a-b391-6799f284b89a-service-ca-bundle\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " 
pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375610 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375633 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvp6f\" (UniqueName: \"kubernetes.io/projected/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-kube-api-access-kvp6f\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375675 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/947012be-68b2-4b9a-b020-258b9b1f0ce8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375699 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5k9s\" (UniqueName: \"kubernetes.io/projected/11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d-kube-api-access-r5k9s\") pod \"ingress-canary-v476v\" (UID: \"11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d\") " pod="openshift-ingress-canary/ingress-canary-v476v" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375737 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375759 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae77852d-9558-4ceb-9eef-1b65bb912a92-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-k6bt9\" (UID: \"ae77852d-9558-4ceb-9eef-1b65bb912a92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375789 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-config\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375926 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kmk94\" (UID: \"bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.375993 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7727z\" (UID: 
\"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376036 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwq4h\" (UniqueName: \"kubernetes.io/projected/96d50594-2379-4758-9882-7328d7bdf1fb-kube-api-access-bwq4h\") pod \"kube-storage-version-migrator-operator-b67b599dd-tt5pr\" (UID: \"96d50594-2379-4758-9882-7328d7bdf1fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376062 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f69009ba-27a8-4518-b6d9-97194d8a77b3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376083 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d-cert\") pod \"ingress-canary-v476v\" (UID: \"11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d\") " pod="openshift-ingress-canary/ingress-canary-v476v" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376103 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2c9sc\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376126 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vcv4\" (UniqueName: \"kubernetes.io/projected/c3b0b579-6db1-463a-b391-6799f284b89a-kube-api-access-8vcv4\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376187 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-serving-cert\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376416 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376642 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kmk94\" (UID: \"bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376754 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a670da2-9970-471b-93fb-7dce5fe66c94-available-featuregates\") pod \"openshift-config-operator-7777fb866f-cmwzb\" (UID: \"5a670da2-9970-471b-93fb-7dce5fe66c94\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376841 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85db3d77-931a-4b82-90e7-acb77f874edc-auth-proxy-config\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376863 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7n6\" (UniqueName: \"kubernetes.io/projected/30a68e73-24db-4b31-a2d3-7d25eb88f0ff-kube-api-access-zm7n6\") pod \"service-ca-9c57cc56f-pzlxq\" (UID: \"30a68e73-24db-4b31-a2d3-7d25eb88f0ff\") " pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376326 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/631034d3-e8b2-42a8-9566-8f8922464b56-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-49lk4\" (UID: \"631034d3-e8b2-42a8-9566-8f8922464b56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.376926 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/85db3d77-931a-4b82-90e7-acb77f874edc-machine-approver-tls\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377097 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljw4w\" (UniqueName: 
\"kubernetes.io/projected/10c98701-5de0-4c9b-a109-01a2985dc868-kube-api-access-ljw4w\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377158 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/744130db-de38-4c78-9684-95c04e411397-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ssjb6\" (UID: \"744130db-de38-4c78-9684-95c04e411397\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377192 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-images\") pod \"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377201 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-encryption-config\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377257 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-trusted-ca-bundle\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377289 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c432b0-edb0-4988-994e-0a888a54621a-config\") pod \"service-ca-operator-777779d784-xgm55\" (UID: \"55c432b0-edb0-4988-994e-0a888a54621a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377389 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-etcd-client\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377422 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-config\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377491 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qdpr\" (UniqueName: \"kubernetes.io/projected/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-kube-api-access-5qdpr\") pod \"packageserver-d55dfcdfc-8sxvh\" (UID: \"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377521 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f69009ba-27a8-4518-b6d9-97194d8a77b3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377568 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-trusted-ca\") pod \"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377587 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10c98701-5de0-4c9b-a109-01a2985dc868-serving-cert\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.377711 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-service-ca\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.378004 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2c9sc\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.378293 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e488a159-e6fd-4ede-af64-43b8b1f31e4f-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-qc728\" (UID: \"e488a159-e6fd-4ede-af64-43b8b1f31e4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.378535 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-images\") pod \"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.378594 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/744130db-de38-4c78-9684-95c04e411397-proxy-tls\") pod \"machine-config-controller-84d6567774-ssjb6\" (UID: \"744130db-de38-4c78-9684-95c04e411397\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.378687 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-serving-cert\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.378840 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-config\") pod \"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.378879 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-trusted-ca-bundle\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.378904 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgbk9\" (UniqueName: \"kubernetes.io/projected/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-kube-api-access-dgbk9\") pod \"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.378909 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/744130db-de38-4c78-9684-95c04e411397-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ssjb6\" (UID: \"744130db-de38-4c78-9684-95c04e411397\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.378994 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-serving-cert\") pod \"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.379044 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0555dc0e-5155-49f2-8c17-cfd091afacf9-trusted-ca\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.384069 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-serving-cert\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.385046 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/350fb173-d354-4879-9e3b-c2429a682c05-srv-cert\") pod \"catalog-operator-68c6474976-g89wb\" (UID: \"350fb173-d354-4879-9e3b-c2429a682c05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.385112 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a670da2-9970-471b-93fb-7dce5fe66c94-serving-cert\") pod \"openshift-config-operator-7777fb866f-cmwzb\" (UID: \"5a670da2-9970-471b-93fb-7dce5fe66c94\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.385617 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-config\") pod \"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.385625 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:01 crc 
kubenswrapper[4807]: I1202 20:00:01.385700 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f69009ba-27a8-4518-b6d9-97194d8a77b3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.385911 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f69009ba-27a8-4518-b6d9-97194d8a77b3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.386067 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/744130db-de38-4c78-9684-95c04e411397-proxy-tls\") pod \"machine-config-controller-84d6567774-ssjb6\" (UID: \"744130db-de38-4c78-9684-95c04e411397\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.386079 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0622ab0a-2b5b-4f71-a9b1-8573297d6351-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x9bfb\" (UID: \"0622ab0a-2b5b-4f71-a9b1-8573297d6351\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.386279 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-trusted-ca\") pod 
\"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.386370 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/350fb173-d354-4879-9e3b-c2429a682c05-profile-collector-cert\") pod \"catalog-operator-68c6474976-g89wb\" (UID: \"350fb173-d354-4879-9e3b-c2429a682c05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.386461 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/85db3d77-931a-4b82-90e7-acb77f874edc-machine-approver-tls\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.386569 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.386671 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e488a159-e6fd-4ede-af64-43b8b1f31e4f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-qc728\" (UID: \"e488a159-e6fd-4ede-af64-43b8b1f31e4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.386860 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.387072 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-oauth-config\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.390452 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-serving-cert\") pod \"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.392532 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-etcd-client\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.417430 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.427615 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ef6cafd-6676-4a26-9b8f-96317dda91bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cfdkz\" (UID: \"3ef6cafd-6676-4a26-9b8f-96317dda91bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.437306 4807 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.450217 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/580f1600-09f3-46f0-8429-1846d2f46ba7-srv-cert\") pod \"olm-operator-6b444d44fb-zm2nv\" (UID: \"580f1600-09f3-46f0-8429-1846d2f46ba7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.457601 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.477931 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.487928 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c432b0-edb0-4988-994e-0a888a54621a-config\") pod \"service-ca-operator-777779d784-xgm55\" (UID: \"55c432b0-edb0-4988-994e-0a888a54621a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.487979 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-config\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488002 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qdpr\" (UniqueName: \"kubernetes.io/projected/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-kube-api-access-5qdpr\") pod 
\"packageserver-d55dfcdfc-8sxvh\" (UID: \"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488031 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0555dc0e-5155-49f2-8c17-cfd091afacf9-trusted-ca\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488056 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g8x5\" (UniqueName: \"kubernetes.io/projected/63a3212f-beb2-4bc0-b88a-947b3a7653c2-kube-api-access-2g8x5\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488076 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-dir\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488094 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48zzt\" (UniqueName: \"kubernetes.io/projected/f1b36133-0f4e-49bf-b33d-224112b2a964-kube-api-access-48zzt\") pod \"migrator-59844c95c7-lmjwv\" (UID: \"f1b36133-0f4e-49bf-b33d-224112b2a964\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488112 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488151 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zqh9\" (UniqueName: \"kubernetes.io/projected/3a3480f6-d49a-4f37-9f9a-605d9efc851e-kube-api-access-8zqh9\") pod \"multus-admission-controller-857f4d67dd-c4hb8\" (UID: \"3a3480f6-d49a-4f37-9f9a-605d9efc851e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488171 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488197 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0555dc0e-5155-49f2-8c17-cfd091afacf9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488215 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/623a789c-ee08-43de-b81c-8f1499ab8fbc-config\") pod \"kube-apiserver-operator-766d6c64bb-89tmm\" (UID: \"623a789c-ee08-43de-b81c-8f1499ab8fbc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" Dec 02 20:00:01 crc 
kubenswrapper[4807]: I1202 20:00:01.488251 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8d9r\" (UniqueName: \"kubernetes.io/projected/3d3a8832-4745-4b77-b855-edea43d079d4-kube-api-access-q8d9r\") pod \"dns-operator-744455d44c-cbpn8\" (UID: \"3d3a8832-4745-4b77-b855-edea43d079d4\") " pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488270 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a3480f6-d49a-4f37-9f9a-605d9efc851e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c4hb8\" (UID: \"3a3480f6-d49a-4f37-9f9a-605d9efc851e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488294 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55c432b0-edb0-4988-994e-0a888a54621a-serving-cert\") pod \"service-ca-operator-777779d784-xgm55\" (UID: \"55c432b0-edb0-4988-994e-0a888a54621a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488313 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae77852d-9558-4ceb-9eef-1b65bb912a92-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-k6bt9\" (UID: \"ae77852d-9558-4ceb-9eef-1b65bb912a92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488333 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-webhook-cert\") pod \"packageserver-d55dfcdfc-8sxvh\" (UID: 
\"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488366 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/947012be-68b2-4b9a-b020-258b9b1f0ce8-proxy-tls\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488394 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c3b0b579-6db1-463a-b391-6799f284b89a-default-certificate\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488422 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2v7j\" (UniqueName: \"kubernetes.io/projected/ae77852d-9558-4ceb-9eef-1b65bb912a92-kube-api-access-p2v7j\") pod \"openshift-controller-manager-operator-756b6f6bc6-k6bt9\" (UID: \"ae77852d-9558-4ceb-9eef-1b65bb912a92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488448 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488473 4807 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-ca\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488508 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488527 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-policies\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488562 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488586 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdwlh\" (UniqueName: \"kubernetes.io/projected/947012be-68b2-4b9a-b020-258b9b1f0ce8-kube-api-access-wdwlh\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488606 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488626 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/947012be-68b2-4b9a-b020-258b9b1f0ce8-images\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488647 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxg5r\" (UniqueName: \"kubernetes.io/projected/55c432b0-edb0-4988-994e-0a888a54621a-kube-api-access-cxg5r\") pod \"service-ca-operator-777779d784-xgm55\" (UID: \"55c432b0-edb0-4988-994e-0a888a54621a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488686 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/30a68e73-24db-4b31-a2d3-7d25eb88f0ff-signing-key\") pod \"service-ca-9c57cc56f-pzlxq\" (UID: \"30a68e73-24db-4b31-a2d3-7d25eb88f0ff\") " pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488706 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488751 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3b0b579-6db1-463a-b391-6799f284b89a-metrics-certs\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488775 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c3b0b579-6db1-463a-b391-6799f284b89a-stats-auth\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488790 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-apiservice-cert\") pod \"packageserver-d55dfcdfc-8sxvh\" (UID: \"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488817 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0555dc0e-5155-49f2-8c17-cfd091afacf9-metrics-tls\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488834 4807 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-qwfhg\" (UniqueName: \"kubernetes.io/projected/0555dc0e-5155-49f2-8c17-cfd091afacf9-kube-api-access-qwfhg\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488854 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488870 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/623a789c-ee08-43de-b81c-8f1499ab8fbc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-89tmm\" (UID: \"623a789c-ee08-43de-b81c-8f1499ab8fbc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488892 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-client\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488911 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/623a789c-ee08-43de-b81c-8f1499ab8fbc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-89tmm\" (UID: \"623a789c-ee08-43de-b81c-8f1499ab8fbc\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488956 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d3a8832-4745-4b77-b855-edea43d079d4-metrics-tls\") pod \"dns-operator-744455d44c-cbpn8\" (UID: \"3d3a8832-4745-4b77-b855-edea43d079d4\") " pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.488977 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc5kl\" (UniqueName: \"kubernetes.io/projected/30d55e05-d66f-496a-b4e1-fb6deb38895f-kube-api-access-wc5kl\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489001 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489049 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-serving-cert\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489067 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-service-ca\") 
pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489083 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/30a68e73-24db-4b31-a2d3-7d25eb88f0ff-signing-cabundle\") pod \"service-ca-9c57cc56f-pzlxq\" (UID: \"30a68e73-24db-4b31-a2d3-7d25eb88f0ff\") " pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489126 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-tmpfs\") pod \"packageserver-d55dfcdfc-8sxvh\" (UID: \"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489150 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489159 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-dir\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489184 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c3b0b579-6db1-463a-b391-6799f284b89a-service-ca-bundle\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489328 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/947012be-68b2-4b9a-b020-258b9b1f0ce8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489376 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5k9s\" (UniqueName: \"kubernetes.io/projected/11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d-kube-api-access-r5k9s\") pod \"ingress-canary-v476v\" (UID: \"11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d\") " pod="openshift-ingress-canary/ingress-canary-v476v" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489456 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489496 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae77852d-9558-4ceb-9eef-1b65bb912a92-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-k6bt9\" (UID: \"ae77852d-9558-4ceb-9eef-1b65bb912a92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" Dec 02 20:00:01 crc 
kubenswrapper[4807]: I1202 20:00:01.489535 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d-cert\") pod \"ingress-canary-v476v\" (UID: \"11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d\") " pod="openshift-ingress-canary/ingress-canary-v476v" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489578 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vcv4\" (UniqueName: \"kubernetes.io/projected/c3b0b579-6db1-463a-b391-6799f284b89a-kube-api-access-8vcv4\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.489656 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm7n6\" (UniqueName: \"kubernetes.io/projected/30a68e73-24db-4b31-a2d3-7d25eb88f0ff-kube-api-access-zm7n6\") pod \"service-ca-9c57cc56f-pzlxq\" (UID: \"30a68e73-24db-4b31-a2d3-7d25eb88f0ff\") " pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.490483 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/947012be-68b2-4b9a-b020-258b9b1f0ce8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.491011 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-tmpfs\") pod \"packageserver-d55dfcdfc-8sxvh\" (UID: \"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" 
Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.499261 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.506571 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0555dc0e-5155-49f2-8c17-cfd091afacf9-metrics-tls\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.526405 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.531377 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0555dc0e-5155-49f2-8c17-cfd091afacf9-trusted-ca\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.537511 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.557878 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.578474 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.584090 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c3b0b579-6db1-463a-b391-6799f284b89a-stats-auth\") pod \"router-default-5444994796-2kvcm\" (UID: 
\"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.599341 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.610278 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3b0b579-6db1-463a-b391-6799f284b89a-service-ca-bundle\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.617620 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.623242 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3b0b579-6db1-463a-b391-6799f284b89a-metrics-certs\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.638963 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.658706 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.665575 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c3b0b579-6db1-463a-b391-6799f284b89a-default-certificate\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 
20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.677551 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.697800 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.719125 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.738503 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.746070 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae77852d-9558-4ceb-9eef-1b65bb912a92-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-k6bt9\" (UID: \"ae77852d-9558-4ceb-9eef-1b65bb912a92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.758635 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.761579 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae77852d-9558-4ceb-9eef-1b65bb912a92-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-k6bt9\" (UID: \"ae77852d-9558-4ceb-9eef-1b65bb912a92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.779041 
4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.798422 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.808517 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.809858 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.817869 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.820006 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6"] Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.838241 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.840552 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/947012be-68b2-4b9a-b020-258b9b1f0ce8-images\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.859028 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.863598 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.884482 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.894381 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.897506 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.904264 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.917462 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.923575 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.937611 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.944798 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.958422 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.964185 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.972037 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.972587 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.972633 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.972498 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.978437 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 20:00:01 crc kubenswrapper[4807]: I1202 20:00:01.983285 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.001137 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.034986 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.037417 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.041115 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-policies\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: 
\"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.043760 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.057574 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.061371 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.076882 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.081651 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.103172 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.112147 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.118688 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.151791 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25cgr\" (UniqueName: \"kubernetes.io/projected/b0dd45ac-1b9d-4274-96c1-98290213c989-kube-api-access-25cgr\") pod \"apiserver-76f77b778f-p49z8\" (UID: \"b0dd45ac-1b9d-4274-96c1-98290213c989\") " pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.170771 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz285\" (UniqueName: \"kubernetes.io/projected/af5c02f0-23a0-45e5-80ae-3510d6d908dc-kube-api-access-bz285\") pod \"controller-manager-879f6c89f-kdp2m\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.177798 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.182330 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/30a68e73-24db-4b31-a2d3-7d25eb88f0ff-signing-cabundle\") pod \"service-ca-9c57cc56f-pzlxq\" (UID: 
\"30a68e73-24db-4b31-a2d3-7d25eb88f0ff\") " pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.198242 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.217704 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.223663 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/30a68e73-24db-4b31-a2d3-7d25eb88f0ff-signing-key\") pod \"service-ca-9c57cc56f-pzlxq\" (UID: \"30a68e73-24db-4b31-a2d3-7d25eb88f0ff\") " pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.237922 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.257894 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.263341 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/947012be-68b2-4b9a-b020-258b9b1f0ce8-proxy-tls\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.275608 4807 request.go:700] Waited for 1.002492804s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-operator-dockercfg-98p87&limit=500&resourceVersion=0 Dec 02 20:00:02 crc 
kubenswrapper[4807]: I1202 20:00:02.277538 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.298279 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.312498 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.318466 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.324805 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.337942 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.346122 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3d3a8832-4745-4b77-b855-edea43d079d4-metrics-tls\") pod \"dns-operator-744455d44c-cbpn8\" (UID: \"3d3a8832-4745-4b77-b855-edea43d079d4\") " pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.382667 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.382903 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.394318 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-webhook-cert\") pod \"packageserver-d55dfcdfc-8sxvh\" (UID: \"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.396082 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-apiservice-cert\") pod \"packageserver-d55dfcdfc-8sxvh\" (UID: \"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.398323 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.405255 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/623a789c-ee08-43de-b81c-8f1499ab8fbc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-89tmm\" (UID: \"623a789c-ee08-43de-b81c-8f1499ab8fbc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.417507 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.438424 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.458011 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 
20:00:02.460958 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/623a789c-ee08-43de-b81c-8f1499ab8fbc-config\") pod \"kube-apiserver-operator-766d6c64bb-89tmm\" (UID: \"623a789c-ee08-43de-b81c-8f1499ab8fbc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.477970 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.489958 4807 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490071 4807 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490103 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-config podName:63a3212f-beb2-4bc0-b88a-947b3a7653c2 nodeName:}" failed. No retries permitted until 2025-12-02 20:00:02.990072653 +0000 UTC m=+138.290980158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-config") pod "etcd-operator-b45778765-n9g5h" (UID: "63a3212f-beb2-4bc0-b88a-947b3a7653c2") : failed to sync configmap cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490235 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55c432b0-edb0-4988-994e-0a888a54621a-config podName:55c432b0-edb0-4988-994e-0a888a54621a nodeName:}" failed. 
No retries permitted until 2025-12-02 20:00:02.990202936 +0000 UTC m=+138.291110441 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/55c432b0-edb0-4988-994e-0a888a54621a-config") pod "service-ca-operator-777779d784-xgm55" (UID: "55c432b0-edb0-4988-994e-0a888a54621a") : failed to sync configmap cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490267 4807 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490305 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-serving-cert podName:63a3212f-beb2-4bc0-b88a-947b3a7653c2 nodeName:}" failed. No retries permitted until 2025-12-02 20:00:02.990296248 +0000 UTC m=+138.291203753 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-serving-cert") pod "etcd-operator-b45778765-n9g5h" (UID: "63a3212f-beb2-4bc0-b88a-947b3a7653c2") : failed to sync secret cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490321 4807 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490373 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-service-ca podName:63a3212f-beb2-4bc0-b88a-947b3a7653c2 nodeName:}" failed. No retries permitted until 2025-12-02 20:00:02.990359309 +0000 UTC m=+138.291266854 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-service-ca" (UniqueName: "kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-service-ca") pod "etcd-operator-b45778765-n9g5h" (UID: "63a3212f-beb2-4bc0-b88a-947b3a7653c2") : failed to sync configmap cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490414 4807 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490455 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d-cert podName:11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d nodeName:}" failed. No retries permitted until 2025-12-02 20:00:02.990444271 +0000 UTC m=+138.291351856 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d-cert") pod "ingress-canary-v476v" (UID: "11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d") : failed to sync secret cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490493 4807 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-client: failed to sync secret cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490532 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-client podName:63a3212f-beb2-4bc0-b88a-947b3a7653c2 nodeName:}" failed. No retries permitted until 2025-12-02 20:00:02.990521453 +0000 UTC m=+138.291429048 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-client") pod "etcd-operator-b45778765-n9g5h" (UID: "63a3212f-beb2-4bc0-b88a-947b3a7653c2") : failed to sync secret cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490572 4807 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490595 4807 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490605 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-ca podName:63a3212f-beb2-4bc0-b88a-947b3a7653c2 nodeName:}" failed. No retries permitted until 2025-12-02 20:00:02.990594145 +0000 UTC m=+138.291501740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-ca" (UniqueName: "kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-ca") pod "etcd-operator-b45778765-n9g5h" (UID: "63a3212f-beb2-4bc0-b88a-947b3a7653c2") : failed to sync configmap cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490637 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55c432b0-edb0-4988-994e-0a888a54621a-serving-cert podName:55c432b0-edb0-4988-994e-0a888a54621a nodeName:}" failed. No retries permitted until 2025-12-02 20:00:02.990626426 +0000 UTC m=+138.291533931 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/55c432b0-edb0-4988-994e-0a888a54621a-serving-cert") pod "service-ca-operator-777779d784-xgm55" (UID: "55c432b0-edb0-4988-994e-0a888a54621a") : failed to sync secret cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490636 4807 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: E1202 20:00:02.490681 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a3480f6-d49a-4f37-9f9a-605d9efc851e-webhook-certs podName:3a3480f6-d49a-4f37-9f9a-605d9efc851e nodeName:}" failed. No retries permitted until 2025-12-02 20:00:02.990670667 +0000 UTC m=+138.291578262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a3480f6-d49a-4f37-9f9a-605d9efc851e-webhook-certs") pod "multus-admission-controller-857f4d67dd-c4hb8" (UID: "3a3480f6-d49a-4f37-9f9a-605d9efc851e") : failed to sync secret cache: timed out waiting for the condition Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.498244 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.517881 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.535607 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kdp2m"] Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.537508 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.557572 4807 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.571345 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-p49z8"] Dec 02 20:00:02 crc kubenswrapper[4807]: W1202 20:00:02.577213 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0dd45ac_1b9d_4274_96c1_98290213c989.slice/crio-f1ff356ea202f25be275978575fe2dc448283d36d6345d6236c9bd827ec8bd22 WatchSource:0}: Error finding container f1ff356ea202f25be275978575fe2dc448283d36d6345d6236c9bd827ec8bd22: Status 404 returned error can't find the container with id f1ff356ea202f25be275978575fe2dc448283d36d6345d6236c9bd827ec8bd22 Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.577490 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.597658 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.617392 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.637347 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.658664 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.678260 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 
20:00:02.697526 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.717377 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.737265 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.758661 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.777497 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.797765 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.824132 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.837936 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.851635 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" event={"ID":"b0dd45ac-1b9d-4274-96c1-98290213c989","Type":"ContainerStarted","Data":"f1ff356ea202f25be275978575fe2dc448283d36d6345d6236c9bd827ec8bd22"} Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.853008 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" event={"ID":"af5c02f0-23a0-45e5-80ae-3510d6d908dc","Type":"ContainerStarted","Data":"0479ccaa47fd86eae3a777649432bb87da764ddd54cc20950bf3cf98bd540e34"} Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.858505 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.878009 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.897520 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.917346 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.937892 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.957759 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 20:00:02 crc kubenswrapper[4807]: I1202 20:00:02.998992 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.018520 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-client\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.018611 4807 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-service-ca\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.018638 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-serving-cert\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.018707 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d-cert\") pod \"ingress-canary-v476v\" (UID: \"11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d\") " pod="openshift-ingress-canary/ingress-canary-v476v" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.018814 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c432b0-edb0-4988-994e-0a888a54621a-config\") pod \"service-ca-operator-777779d784-xgm55\" (UID: \"55c432b0-edb0-4988-994e-0a888a54621a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.018847 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-config\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.018895 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.018974 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a3480f6-d49a-4f37-9f9a-605d9efc851e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c4hb8\" (UID: \"3a3480f6-d49a-4f37-9f9a-605d9efc851e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.019027 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55c432b0-edb0-4988-994e-0a888a54621a-serving-cert\") pod \"service-ca-operator-777779d784-xgm55\" (UID: \"55c432b0-edb0-4988-994e-0a888a54621a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.019089 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-ca\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.021190 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55c432b0-edb0-4988-994e-0a888a54621a-config\") pod \"service-ca-operator-777779d784-xgm55\" (UID: \"55c432b0-edb0-4988-994e-0a888a54621a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.024448 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55c432b0-edb0-4988-994e-0a888a54621a-serving-cert\") pod \"service-ca-operator-777779d784-xgm55\" (UID: 
\"55c432b0-edb0-4988-994e-0a888a54621a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.025066 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a3480f6-d49a-4f37-9f9a-605d9efc851e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c4hb8\" (UID: \"3a3480f6-d49a-4f37-9f9a-605d9efc851e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.025245 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d-cert\") pod \"ingress-canary-v476v\" (UID: \"11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d\") " pod="openshift-ingress-canary/ingress-canary-v476v" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.038768 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.058780 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.078977 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.099021 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.118166 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.137612 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.158842 4807 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.207087 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp587\" (UniqueName: \"kubernetes.io/projected/5a670da2-9970-471b-93fb-7dce5fe66c94-kube-api-access-cp587\") pod \"openshift-config-operator-7777fb866f-cmwzb\" (UID: \"5a670da2-9970-471b-93fb-7dce5fe66c94\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.246250 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxw7p\" (UniqueName: \"kubernetes.io/projected/3ef6cafd-6676-4a26-9b8f-96317dda91bf-kube-api-access-wxw7p\") pod \"package-server-manager-789f6589d5-cfdkz\" (UID: \"3ef6cafd-6676-4a26-9b8f-96317dda91bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.275881 4807 request.go:700] Waited for 1.904819943s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/serviceaccounts/machine-config-controller/token Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.279684 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt7k7\" (UniqueName: \"kubernetes.io/projected/f69009ba-27a8-4518-b6d9-97194d8a77b3-kube-api-access-gt7k7\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.304377 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xbdcz\" (UniqueName: \"kubernetes.io/projected/744130db-de38-4c78-9684-95c04e411397-kube-api-access-xbdcz\") pod \"machine-config-controller-84d6567774-ssjb6\" (UID: \"744130db-de38-4c78-9684-95c04e411397\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.314777 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj2lc\" (UniqueName: \"kubernetes.io/projected/85db3d77-931a-4b82-90e7-acb77f874edc-kube-api-access-bj2lc\") pod \"machine-approver-56656f9798-b9nt9\" (UID: \"85db3d77-931a-4b82-90e7-acb77f874edc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.333285 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb4cr\" (UniqueName: \"kubernetes.io/projected/631034d3-e8b2-42a8-9566-8f8922464b56-kube-api-access-zb4cr\") pod \"cluster-samples-operator-665b6dd947-49lk4\" (UID: \"631034d3-e8b2-42a8-9566-8f8922464b56\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.358452 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqscq\" (UniqueName: \"kubernetes.io/projected/e488a159-e6fd-4ede-af64-43b8b1f31e4f-kube-api-access-kqscq\") pod \"openshift-apiserver-operator-796bbdcf4f-qc728\" (UID: \"e488a159-e6fd-4ede-af64-43b8b1f31e4f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.369291 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.375596 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-886d9\" (UniqueName: \"kubernetes.io/projected/b5784f10-10dd-4363-a50a-60d37b9c9ec5-kube-api-access-886d9\") pod \"downloads-7954f5f757-frmzm\" (UID: \"b5784f10-10dd-4363-a50a-60d37b9c9ec5\") " pod="openshift-console/downloads-7954f5f757-frmzm" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.393414 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dqrz\" (UniqueName: \"kubernetes.io/projected/b7c39998-d9b9-445b-8fa5-b338ccdfaf6f-kube-api-access-8dqrz\") pod \"apiserver-7bbb656c7d-6lrj2\" (UID: \"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.412162 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqbz2\" (UniqueName: \"kubernetes.io/projected/350fb173-d354-4879-9e3b-c2429a682c05-kube-api-access-qqbz2\") pod \"catalog-operator-68c6474976-g89wb\" (UID: \"350fb173-d354-4879-9e3b-c2429a682c05\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.432246 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcwpx\" (UniqueName: \"kubernetes.io/projected/617adf8b-8ca9-4578-8c24-8f6b22713567-kube-api-access-qcwpx\") pod \"control-plane-machine-set-operator-78cbb6b69f-n7sh5\" (UID: \"617adf8b-8ca9-4578-8c24-8f6b22713567\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.434112 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.453019 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.461182 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.462148 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smj9s\" (UniqueName: \"kubernetes.io/projected/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-kube-api-access-smj9s\") pod \"console-f9d7485db-bjmsc\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.467216 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.471099 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8thbr\" (UniqueName: \"kubernetes.io/projected/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-kube-api-access-8thbr\") pod \"marketplace-operator-79b997595-2c9sc\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.474817 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.496089 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw5hl\" (UniqueName: \"kubernetes.io/projected/3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8-kube-api-access-dw5hl\") pod \"machine-api-operator-5694c8668f-7c8mj\" (UID: \"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.511293 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvp6f\" (UniqueName: \"kubernetes.io/projected/98cb9822-a3b6-48a8-93ce-1b619a9a8e7c-kube-api-access-kvp6f\") pod \"authentication-operator-69f744f599-7727z\" (UID: \"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.515188 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.532740 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0622ab0a-2b5b-4f71-a9b1-8573297d6351-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-x9bfb\" (UID: \"0622ab0a-2b5b-4f71-a9b1-8573297d6351\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.535756 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.537773 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f69009ba-27a8-4518-b6d9-97194d8a77b3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kq6q9\" (UID: \"f69009ba-27a8-4518-b6d9-97194d8a77b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.538198 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-ca\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.538374 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-config\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.538630 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-service-ca\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.538806 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mljmt\" (UniqueName: \"kubernetes.io/projected/580f1600-09f3-46f0-8429-1846d2f46ba7-kube-api-access-mljmt\") pod 
\"olm-operator-6b444d44fb-zm2nv\" (UID: \"580f1600-09f3-46f0-8429-1846d2f46ba7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.541231 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-etcd-client\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.541731 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63a3212f-beb2-4bc0-b88a-947b3a7653c2-serving-cert\") pod \"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.542013 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.556370 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kmk94\" (UID: \"bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.566171 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.571303 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-frmzm" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.572206 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwq4h\" (UniqueName: \"kubernetes.io/projected/96d50594-2379-4758-9882-7328d7bdf1fb-kube-api-access-bwq4h\") pod \"kube-storage-version-migrator-operator-b67b599dd-tt5pr\" (UID: \"96d50594-2379-4758-9882-7328d7bdf1fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.593299 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.595639 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljw4w\" (UniqueName: \"kubernetes.io/projected/10c98701-5de0-4c9b-a109-01a2985dc868-kube-api-access-ljw4w\") pod \"route-controller-manager-6576b87f9c-rzpzb\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.606557 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.612319 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgbk9\" (UniqueName: \"kubernetes.io/projected/58f0bd92-2b1f-4d7e-899f-556bbf8cdf00-kube-api-access-dgbk9\") pod \"console-operator-58897d9998-b46dw\" (UID: \"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00\") " pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.628543 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.651381 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:03 crc kubenswrapper[4807]: W1202 20:00:03.651912 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85db3d77_931a_4b82_90e7_acb77f874edc.slice/crio-2b1cd133e51ddc189d0a4c92cbb4c750dd7bf7e0b88c7586735dfecbb6b2bfb6 WatchSource:0}: Error finding container 2b1cd133e51ddc189d0a4c92cbb4c750dd7bf7e0b88c7586735dfecbb6b2bfb6: Status 404 returned error can't find the container with id 2b1cd133e51ddc189d0a4c92cbb4c750dd7bf7e0b88c7586735dfecbb6b2bfb6 Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.659201 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0555dc0e-5155-49f2-8c17-cfd091afacf9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.679911 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qdpr\" (UniqueName: \"kubernetes.io/projected/d66ba69f-89dd-49ea-91f6-6682ce7bdc8a-kube-api-access-5qdpr\") pod \"packageserver-d55dfcdfc-8sxvh\" (UID: \"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.691489 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g8x5\" (UniqueName: \"kubernetes.io/projected/63a3212f-beb2-4bc0-b88a-947b3a7653c2-kube-api-access-2g8x5\") pod 
\"etcd-operator-b45778765-n9g5h\" (UID: \"63a3212f-beb2-4bc0-b88a-947b3a7653c2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.698781 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.715348 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.715963 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.721876 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.728830 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8d9r\" (UniqueName: \"kubernetes.io/projected/3d3a8832-4745-4b77-b855-edea43d079d4-kube-api-access-q8d9r\") pod \"dns-operator-744455d44c-cbpn8\" (UID: \"3d3a8832-4745-4b77-b855-edea43d079d4\") " pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.737528 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.738076 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxg5r\" (UniqueName: \"kubernetes.io/projected/55c432b0-edb0-4988-994e-0a888a54621a-kube-api-access-cxg5r\") pod \"service-ca-operator-777779d784-xgm55\" (UID: \"55c432b0-edb0-4988-994e-0a888a54621a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.759596 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwfhg\" (UniqueName: \"kubernetes.io/projected/0555dc0e-5155-49f2-8c17-cfd091afacf9-kube-api-access-qwfhg\") pod \"ingress-operator-5b745b69d9-69994\" (UID: \"0555dc0e-5155-49f2-8c17-cfd091afacf9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.777175 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm7n6\" (UniqueName: \"kubernetes.io/projected/30a68e73-24db-4b31-a2d3-7d25eb88f0ff-kube-api-access-zm7n6\") pod \"service-ca-9c57cc56f-pzlxq\" (UID: \"30a68e73-24db-4b31-a2d3-7d25eb88f0ff\") " pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.792843 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2v7j\" (UniqueName: \"kubernetes.io/projected/ae77852d-9558-4ceb-9eef-1b65bb912a92-kube-api-access-p2v7j\") pod \"openshift-controller-manager-operator-756b6f6bc6-k6bt9\" (UID: \"ae77852d-9558-4ceb-9eef-1b65bb912a92\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.813327 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5k9s\" (UniqueName: 
\"kubernetes.io/projected/11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d-kube-api-access-r5k9s\") pod \"ingress-canary-v476v\" (UID: \"11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d\") " pod="openshift-ingress-canary/ingress-canary-v476v" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.821857 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.836191 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48zzt\" (UniqueName: \"kubernetes.io/projected/f1b36133-0f4e-49bf-b33d-224112b2a964-kube-api-access-48zzt\") pod \"migrator-59844c95c7-lmjwv\" (UID: \"f1b36133-0f4e-49bf-b33d-224112b2a964\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.854172 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdwlh\" (UniqueName: \"kubernetes.io/projected/947012be-68b2-4b9a-b020-258b9b1f0ce8-kube-api-access-wdwlh\") pod \"machine-config-operator-74547568cd-w42b5\" (UID: \"947012be-68b2-4b9a-b020-258b9b1f0ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.864730 4807 generic.go:334] "Generic (PLEG): container finished" podID="b0dd45ac-1b9d-4274-96c1-98290213c989" containerID="af63d0ac0064c7bacd2e28674eb0db1a1fb80730d85056d56d18180f9a9188a6" exitCode=0 Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.864820 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" event={"ID":"b0dd45ac-1b9d-4274-96c1-98290213c989","Type":"ContainerDied","Data":"af63d0ac0064c7bacd2e28674eb0db1a1fb80730d85056d56d18180f9a9188a6"} Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.870243 4807 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" event={"ID":"af5c02f0-23a0-45e5-80ae-3510d6d908dc","Type":"ContainerStarted","Data":"94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472"} Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.870789 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.871564 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.871706 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" event={"ID":"85db3d77-931a-4b82-90e7-acb77f874edc","Type":"ContainerStarted","Data":"2b1cd133e51ddc189d0a4c92cbb4c750dd7bf7e0b88c7586735dfecbb6b2bfb6"} Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.876006 4807 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-kdp2m container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.876056 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" podUID="af5c02f0-23a0-45e5-80ae-3510d6d908dc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.876570 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/623a789c-ee08-43de-b81c-8f1499ab8fbc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-89tmm\" (UID: \"623a789c-ee08-43de-b81c-8f1499ab8fbc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.885930 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.890999 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vcv4\" (UniqueName: \"kubernetes.io/projected/c3b0b579-6db1-463a-b391-6799f284b89a-kube-api-access-8vcv4\") pod \"router-default-5444994796-2kvcm\" (UID: \"c3b0b579-6db1-463a-b391-6799f284b89a\") " pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.910154 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.912188 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc5kl\" (UniqueName: \"kubernetes.io/projected/30d55e05-d66f-496a-b4e1-fb6deb38895f-kube-api-access-wc5kl\") pod \"oauth-openshift-558db77b4-v4lp2\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.916265 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.925615 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.937364 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.939576 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zqh9\" (UniqueName: \"kubernetes.io/projected/3a3480f6-d49a-4f37-9f9a-605d9efc851e-kube-api-access-8zqh9\") pod \"multus-admission-controller-857f4d67dd-c4hb8\" (UID: \"3a3480f6-d49a-4f37-9f9a-605d9efc851e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.942244 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.944877 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.954097 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.957697 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.959958 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.986327 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.989937 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 20:00:03 crc kubenswrapper[4807]: I1202 20:00:03.994166 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:03.998588 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.004321 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v476v" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.005085 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-bjmsc"] Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.005123 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb"] Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.018807 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.038235 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.057162 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.084122 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 20:00:04 crc 
kubenswrapper[4807]: W1202 20:00:04.107972 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a670da2_9970_471b_93fb_7dce5fe66c94.slice/crio-925f5199d2dd1191b913aff5212577a1533073c01aa83900ca36727548483be9 WatchSource:0}: Error finding container 925f5199d2dd1191b913aff5212577a1533073c01aa83900ca36727548483be9: Status 404 returned error can't find the container with id 925f5199d2dd1191b913aff5212577a1533073c01aa83900ca36727548483be9 Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.147375 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7mbs\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-kube-api-access-k7mbs\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.147437 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.147607 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-trusted-ca\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: E1202 20:00:04.147742 4807 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:04.647730158 +0000 UTC m=+139.948637653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.147809 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-bound-sa-token\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.147887 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-tls\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.147906 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8f3efef-a103-4813-94ed-1c9bd0113f84-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 
02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.147921 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-certificates\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.148042 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8f3efef-a103-4813-94ed-1c9bd0113f84-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.178051 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.201259 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.260590 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.261082 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8f3efef-a103-4813-94ed-1c9bd0113f84-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269040 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-plugins-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269112 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-socket-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269215 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mpk8\" (UniqueName: 
\"kubernetes.io/projected/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-kube-api-access-6mpk8\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269235 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dcbd369-4c3b-4d52-b93f-a2d2e1947163-metrics-tls\") pod \"dns-default-b8g8z\" (UID: \"5dcbd369-4c3b-4d52-b93f-a2d2e1947163\") " pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269292 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7mbs\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-kube-api-access-k7mbs\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269454 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-mountpoint-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269474 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-csi-data-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269557 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-7ms6l\" (UniqueName: \"kubernetes.io/projected/5dcbd369-4c3b-4d52-b93f-a2d2e1947163-kube-api-access-7ms6l\") pod \"dns-default-b8g8z\" (UID: \"5dcbd369-4c3b-4d52-b93f-a2d2e1947163\") " pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269767 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-trusted-ca\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269792 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/73115f00-4586-4157-86c0-332528cea752-node-bootstrap-token\") pod \"machine-config-server-dsdtv\" (UID: \"73115f00-4586-4157-86c0-332528cea752\") " pod="openshift-machine-config-operator/machine-config-server-dsdtv" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269922 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-registration-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.269952 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49rp\" (UniqueName: \"kubernetes.io/projected/73115f00-4586-4157-86c0-332528cea752-kube-api-access-b49rp\") pod \"machine-config-server-dsdtv\" (UID: \"73115f00-4586-4157-86c0-332528cea752\") " pod="openshift-machine-config-operator/machine-config-server-dsdtv" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 
20:00:04.270008 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/73115f00-4586-4157-86c0-332528cea752-certs\") pod \"machine-config-server-dsdtv\" (UID: \"73115f00-4586-4157-86c0-332528cea752\") " pod="openshift-machine-config-operator/machine-config-server-dsdtv" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.270075 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bc7f08d-6984-4bab-9220-761b68fdec0d-config-volume\") pod \"collect-profiles-29411760-dv4c6\" (UID: \"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.270091 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bc7f08d-6984-4bab-9220-761b68fdec0d-secret-volume\") pod \"collect-profiles-29411760-dv4c6\" (UID: \"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.270559 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-bound-sa-token\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.270623 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfvpb\" (UniqueName: \"kubernetes.io/projected/3bc7f08d-6984-4bab-9220-761b68fdec0d-kube-api-access-rfvpb\") pod \"collect-profiles-29411760-dv4c6\" (UID: 
\"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.270642 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dcbd369-4c3b-4d52-b93f-a2d2e1947163-config-volume\") pod \"dns-default-b8g8z\" (UID: \"5dcbd369-4c3b-4d52-b93f-a2d2e1947163\") " pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.285660 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8f3efef-a103-4813-94ed-1c9bd0113f84-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.309144 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-trusted-ca\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.322306 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-tls\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.322441 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8f3efef-a103-4813-94ed-1c9bd0113f84-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.322475 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-certificates\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: E1202 20:00:04.323302 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:04.823260392 +0000 UTC m=+140.124167877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.323637 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8f3efef-a103-4813-94ed-1c9bd0113f84-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.324208 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-certificates\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.351787 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728"] Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.371084 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-bound-sa-token\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.371787 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-tls\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.371839 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7mbs\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-kube-api-access-k7mbs\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.377188 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2"] Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424506 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-plugins-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424539 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-socket-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424558 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mpk8\" (UniqueName: \"kubernetes.io/projected/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-kube-api-access-6mpk8\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424582 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dcbd369-4c3b-4d52-b93f-a2d2e1947163-metrics-tls\") pod \"dns-default-b8g8z\" (UID: \"5dcbd369-4c3b-4d52-b93f-a2d2e1947163\") " pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424621 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424640 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-mountpoint-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424655 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-csi-data-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424680 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ms6l\" (UniqueName: \"kubernetes.io/projected/5dcbd369-4c3b-4d52-b93f-a2d2e1947163-kube-api-access-7ms6l\") pod \"dns-default-b8g8z\" (UID: \"5dcbd369-4c3b-4d52-b93f-a2d2e1947163\") " pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424702 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/73115f00-4586-4157-86c0-332528cea752-node-bootstrap-token\") pod \"machine-config-server-dsdtv\" (UID: \"73115f00-4586-4157-86c0-332528cea752\") " pod="openshift-machine-config-operator/machine-config-server-dsdtv" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424747 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-registration-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424772 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b49rp\" (UniqueName: \"kubernetes.io/projected/73115f00-4586-4157-86c0-332528cea752-kube-api-access-b49rp\") pod \"machine-config-server-dsdtv\" (UID: \"73115f00-4586-4157-86c0-332528cea752\") " pod="openshift-machine-config-operator/machine-config-server-dsdtv" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424789 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/73115f00-4586-4157-86c0-332528cea752-certs\") pod \"machine-config-server-dsdtv\" (UID: \"73115f00-4586-4157-86c0-332528cea752\") " pod="openshift-machine-config-operator/machine-config-server-dsdtv" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424806 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bc7f08d-6984-4bab-9220-761b68fdec0d-config-volume\") pod \"collect-profiles-29411760-dv4c6\" (UID: \"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424820 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bc7f08d-6984-4bab-9220-761b68fdec0d-secret-volume\") pod \"collect-profiles-29411760-dv4c6\" (UID: \"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.424847 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfvpb\" (UniqueName: \"kubernetes.io/projected/3bc7f08d-6984-4bab-9220-761b68fdec0d-kube-api-access-rfvpb\") pod \"collect-profiles-29411760-dv4c6\" (UID: \"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 
20:00:04.424863 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dcbd369-4c3b-4d52-b93f-a2d2e1947163-config-volume\") pod \"dns-default-b8g8z\" (UID: \"5dcbd369-4c3b-4d52-b93f-a2d2e1947163\") " pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.425354 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-plugins-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.425427 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-socket-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.425750 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-registration-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.426277 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5dcbd369-4c3b-4d52-b93f-a2d2e1947163-config-volume\") pod \"dns-default-b8g8z\" (UID: \"5dcbd369-4c3b-4d52-b93f-a2d2e1947163\") " pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.426339 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-mountpoint-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: E1202 20:00:04.426639 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:04.926627381 +0000 UTC m=+140.227534876 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.427017 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-csi-data-dir\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.427701 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bc7f08d-6984-4bab-9220-761b68fdec0d-config-volume\") pod \"collect-profiles-29411760-dv4c6\" (UID: \"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.429459 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/73115f00-4586-4157-86c0-332528cea752-node-bootstrap-token\") pod \"machine-config-server-dsdtv\" (UID: \"73115f00-4586-4157-86c0-332528cea752\") " pod="openshift-machine-config-operator/machine-config-server-dsdtv" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.436387 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bc7f08d-6984-4bab-9220-761b68fdec0d-secret-volume\") pod \"collect-profiles-29411760-dv4c6\" (UID: \"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.455396 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/73115f00-4586-4157-86c0-332528cea752-certs\") pod \"machine-config-server-dsdtv\" (UID: \"73115f00-4586-4157-86c0-332528cea752\") " pod="openshift-machine-config-operator/machine-config-server-dsdtv" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.457018 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dcbd369-4c3b-4d52-b93f-a2d2e1947163-metrics-tls\") pod \"dns-default-b8g8z\" (UID: \"5dcbd369-4c3b-4d52-b93f-a2d2e1947163\") " pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.480090 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ms6l\" (UniqueName: \"kubernetes.io/projected/5dcbd369-4c3b-4d52-b93f-a2d2e1947163-kube-api-access-7ms6l\") pod \"dns-default-b8g8z\" (UID: \"5dcbd369-4c3b-4d52-b93f-a2d2e1947163\") " pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.517629 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mpk8\" (UniqueName: 
\"kubernetes.io/projected/bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df-kube-api-access-6mpk8\") pod \"csi-hostpathplugin-6j8qw\" (UID: \"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df\") " pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.526494 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.526622 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49rp\" (UniqueName: \"kubernetes.io/projected/73115f00-4586-4157-86c0-332528cea752-kube-api-access-b49rp\") pod \"machine-config-server-dsdtv\" (UID: \"73115f00-4586-4157-86c0-332528cea752\") " pod="openshift-machine-config-operator/machine-config-server-dsdtv" Dec 02 20:00:04 crc kubenswrapper[4807]: E1202 20:00:04.526709 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:05.026680652 +0000 UTC m=+140.327588147 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.526858 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: E1202 20:00:04.527196 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:05.027184654 +0000 UTC m=+140.328092149 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.552261 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfvpb\" (UniqueName: \"kubernetes.io/projected/3bc7f08d-6984-4bab-9220-761b68fdec0d-kube-api-access-rfvpb\") pod \"collect-profiles-29411760-dv4c6\" (UID: \"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.610285 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.619446 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dsdtv" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.627564 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:04 crc kubenswrapper[4807]: E1202 20:00:04.627993 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 20:00:05.127975032 +0000 UTC m=+140.428882527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.639422 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.764189 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: E1202 20:00:04.764987 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:05.264973331 +0000 UTC m=+140.565880826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.832683 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.865622 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:04 crc kubenswrapper[4807]: E1202 20:00:04.866012 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:05.365994744 +0000 UTC m=+140.666902239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.866067 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:04 crc kubenswrapper[4807]: E1202 20:00:04.866364 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:05.366358273 +0000 UTC m=+140.667265758 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.884837 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" event={"ID":"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f","Type":"ContainerStarted","Data":"76a4dc1b7feada3c3c24818c8ae30c9a8df1b8c3f2e538c27d6dd3b5d4fa9dbf"} Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.886370 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" event={"ID":"e488a159-e6fd-4ede-af64-43b8b1f31e4f","Type":"ContainerStarted","Data":"97c1f20017a6619b4f99aade1ecbc6e4fddd26c5781726de9a2113d9eb71a3a6"} Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.887700 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" event={"ID":"5a670da2-9970-471b-93fb-7dce5fe66c94","Type":"ContainerStarted","Data":"925f5199d2dd1191b913aff5212577a1533073c01aa83900ca36727548483be9"} Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.893955 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" event={"ID":"b0dd45ac-1b9d-4274-96c1-98290213c989","Type":"ContainerStarted","Data":"b74f981ce8f43464a40e02b3d74f852e66df5629672e96ff5f5123272029ac4c"} Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.905076 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" 
event={"ID":"85db3d77-931a-4b82-90e7-acb77f874edc","Type":"ContainerStarted","Data":"94058bb38e0256514b9e169b21a200f1110c81d1435694e898b8847757e32030"} Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.909657 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2kvcm" event={"ID":"c3b0b579-6db1-463a-b391-6799f284b89a","Type":"ContainerStarted","Data":"705696033dbc021eed8d88519fe3ede50fb5a238315e16228b056f39db662dba"} Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.911743 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bjmsc" event={"ID":"d31ed2df-d4fa-4b71-a218-20d453f1d8cb","Type":"ContainerStarted","Data":"672c8f69b1e38276e29a838b77aff72f952ebeb40ba4d0dfaa187b3dfc318d38"} Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.917472 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dsdtv" event={"ID":"73115f00-4586-4157-86c0-332528cea752","Type":"ContainerStarted","Data":"34828760ef4dd57a520161ca0b4fefcc16ecca42b82f7a1991d57402ed00072b"} Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.928127 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:00:04 crc kubenswrapper[4807]: I1202 20:00:04.967889 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:04 crc kubenswrapper[4807]: E1202 20:00:04.968259 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:05.468236016 +0000 UTC m=+140.769143511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.070279 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:05 crc kubenswrapper[4807]: E1202 20:00:05.070792 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:05.570777086 +0000 UTC m=+140.871684581 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.171237 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:05 crc kubenswrapper[4807]: E1202 20:00:05.171682 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:05.671665216 +0000 UTC m=+140.972572711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.197111 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv"] Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.211469 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-frmzm"] Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.235967 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6"] Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.249449 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4"] Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.272808 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:05 crc kubenswrapper[4807]: E1202 20:00:05.273117 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 20:00:05.773105289 +0000 UTC m=+141.074012784 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.379698 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:05 crc kubenswrapper[4807]: E1202 20:00:05.380502 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:05.880483422 +0000 UTC m=+141.181390917 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.482676 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:05 crc kubenswrapper[4807]: E1202 20:00:05.483063 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:05.983050022 +0000 UTC m=+141.283957517 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.516733 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz"] Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.543133 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb"] Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.545101 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb"] Dec 02 20:00:05 crc kubenswrapper[4807]: W1202 20:00:05.546897 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ef6cafd_6676_4a26_9b8f_96317dda91bf.slice/crio-7b3b08d5b3a0006945eb6fdab5dfe50c7da0b4d048ea23003a6f92b9247e3b83 WatchSource:0}: Error finding container 7b3b08d5b3a0006945eb6fdab5dfe50c7da0b4d048ea23003a6f92b9247e3b83: Status 404 returned error can't find the container with id 7b3b08d5b3a0006945eb6fdab5dfe50c7da0b4d048ea23003a6f92b9247e3b83 Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.547148 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2c9sc"] Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.557780 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb"] Dec 02 
20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.563043 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9"] Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.568755 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5"] Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.583306 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:05 crc kubenswrapper[4807]: E1202 20:00:05.583660 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:06.083644435 +0000 UTC m=+141.384551930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.684677 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:05 crc kubenswrapper[4807]: E1202 20:00:05.685044 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:06.185030677 +0000 UTC m=+141.485938172 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.756924 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" podStartSLOduration=121.756909246 podStartE2EDuration="2m1.756909246s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:05.756626379 +0000 UTC m=+141.057533894" watchObservedRunningTime="2025-12-02 20:00:05.756909246 +0000 UTC m=+141.057816741" Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.798219 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:05 crc kubenswrapper[4807]: E1202 20:00:05.798641 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:06.298626686 +0000 UTC m=+141.599534181 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.899659 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:05 crc kubenswrapper[4807]: E1202 20:00:05.900295 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:06.400280705 +0000 UTC m=+141.701188200 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.970439 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7c8mj"] Dec 02 20:00:05 crc kubenswrapper[4807]: I1202 20:00:05.973013 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:05.999662 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" event={"ID":"b0dd45ac-1b9d-4274-96c1-98290213c989","Type":"ContainerStarted","Data":"4cb50b8099a5b914aefdb874d928a68fd31cdb659292fe8dfa0e7dc7e12d89da"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.010471 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c4hb8"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.011095 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.011468 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:06.511284173 +0000 UTC m=+141.812191668 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.056562 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.057021 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:06.557004787 +0000 UTC m=+141.857912282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.065457 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" event={"ID":"10c98701-5de0-4c9b-a109-01a2985dc868","Type":"ContainerStarted","Data":"491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.065506 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" event={"ID":"10c98701-5de0-4c9b-a109-01a2985dc868","Type":"ContainerStarted","Data":"8bff21502f379b82383a01094aab4ef1552bbc6054025805aa10bbfe2900a0aa"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.068695 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.071049 4807 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rzpzb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.071178 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" 
podUID="10c98701-5de0-4c9b-a109-01a2985dc868" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.101789 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" event={"ID":"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c","Type":"ContainerStarted","Data":"7fee9f323894aabc80d70b6efd6262707b81c986b82e99c8cd79d022dc229e88"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.105816 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7727z"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.130047 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xgm55"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.139890 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-69994"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.139947 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-n9g5h"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.143971 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pzlxq"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.148104 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-b46dw"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.150111 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.157978 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.158890 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:06.65885786 +0000 UTC m=+141.959765355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.171490 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.172412 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" event={"ID":"744130db-de38-4c78-9684-95c04e411397","Type":"ContainerStarted","Data":"a43b270e74e9ed61aeb189710487ff7d0d4a20e2f6ca31742b03f6adf20c067a"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.172482 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" 
event={"ID":"744130db-de38-4c78-9684-95c04e411397","Type":"ContainerStarted","Data":"6c56da38f6f24364322b33234926f3e315fad266f7bdbda3cd37c75f52b03227"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.194093 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.204108 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.207827 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v4lp2"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.210904 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-frmzm" event={"ID":"b5784f10-10dd-4363-a50a-60d37b9c9ec5","Type":"ContainerStarted","Data":"58f5f13a88cc27366b6632de18cf895682d81a092ea5f984f248fcd4d78d76fc"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.210940 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-frmzm" event={"ID":"b5784f10-10dd-4363-a50a-60d37b9c9ec5","Type":"ContainerStarted","Data":"4f8b1082a7712e53f4d5128e8248690de5b1bfb3f913027ef88390982f3f7928"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.214161 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-frmzm" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.223901 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-cbpn8"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.238549 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-b8g8z"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.238887 4807 patch_prober.go:28] 
interesting pod/downloads-7954f5f757-frmzm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.239112 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-frmzm" podUID="b5784f10-10dd-4363-a50a-60d37b9c9ec5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.240705 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.243152 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bjmsc" event={"ID":"d31ed2df-d4fa-4b71-a218-20d453f1d8cb","Type":"ContainerStarted","Data":"b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.248892 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.252097 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" event={"ID":"0622ab0a-2b5b-4f71-a9b1-8573297d6351","Type":"ContainerStarted","Data":"5ba7230c73547cd9dbefbfcb29655f383c17e37a9aa0f1849e3d3d0b22ccbaa4"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.255679 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.262577 4807 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6j8qw"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.263344 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.264221 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:06.764205795 +0000 UTC m=+142.065113290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.281330 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v476v"] Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.290052 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" event={"ID":"580f1600-09f3-46f0-8429-1846d2f46ba7","Type":"ContainerStarted","Data":"fcdb4845eb3dadde3137b5e58d3d95dd4be3c5d4b6090bc32c9f8142b216975b"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.290103 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" event={"ID":"580f1600-09f3-46f0-8429-1846d2f46ba7","Type":"ContainerStarted","Data":"56eb3f89032ab04843c97ae640694bc9393e81dd9a79f8d062ed812544c27ebb"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.291076 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.302583 4807 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zm2nv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.302871 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" podUID="580f1600-09f3-46f0-8429-1846d2f46ba7" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.318916 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" event={"ID":"631034d3-e8b2-42a8-9566-8f8922464b56","Type":"ContainerStarted","Data":"01702c8981bba396efe3896ea64d7014544408414e52630561bb2921fa810b4e"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.351111 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" event={"ID":"e488a159-e6fd-4ede-af64-43b8b1f31e4f","Type":"ContainerStarted","Data":"5c2c8b99165a5fd315033c380db021ab83ba577f85c9f48d4cacf3eb4f3cbb53"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.371032 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.388030 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:06.887987673 +0000 UTC m=+142.188895168 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.429331 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" podStartSLOduration=121.429314384 podStartE2EDuration="2m1.429314384s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.396709018 +0000 UTC m=+141.697616513" watchObservedRunningTime="2025-12-02 20:00:06.429314384 +0000 UTC m=+141.730221879" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.441870 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-frmzm" podStartSLOduration=122.441849089 
podStartE2EDuration="2m2.441849089s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.428973366 +0000 UTC m=+141.729880861" watchObservedRunningTime="2025-12-02 20:00:06.441849089 +0000 UTC m=+141.742756574" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.456305 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" event={"ID":"85db3d77-931a-4b82-90e7-acb77f874edc","Type":"ContainerStarted","Data":"5a212b67b242f054a7786e2e56e52755400352239cb66214c7b68505d6dcf465"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.463618 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" podStartSLOduration=122.46358687 podStartE2EDuration="2m2.46358687s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.463443826 +0000 UTC m=+141.764351321" watchObservedRunningTime="2025-12-02 20:00:06.46358687 +0000 UTC m=+141.764494365" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.474001 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5" event={"ID":"617adf8b-8ca9-4578-8c24-8f6b22713567","Type":"ContainerStarted","Data":"e166ec7fa1bf31209421a8cf77df5961122b587ee9431e2c2a44047067bae0c7"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.482700 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: 
\"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.483932 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:06.983919197 +0000 UTC m=+142.284826692 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.510453 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2kvcm" event={"ID":"c3b0b579-6db1-463a-b391-6799f284b89a","Type":"ContainerStarted","Data":"945c8484da3f8615dc3845127d3e9fd71b9c5284ff410f87d7bf80be7c717dc3"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.513877 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" event={"ID":"3ef6cafd-6676-4a26-9b8f-96317dda91bf","Type":"ContainerStarted","Data":"7b3b08d5b3a0006945eb6fdab5dfe50c7da0b4d048ea23003a6f92b9247e3b83"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.542577 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" event={"ID":"f69009ba-27a8-4518-b6d9-97194d8a77b3","Type":"ContainerStarted","Data":"ce61ba2d964cbaa0b0840b62be9b4c62a51e573b12c3ff0cb53c6a182cda5d4e"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 
20:00:06.557862 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-qc728" podStartSLOduration=122.555217463 podStartE2EDuration="2m2.555217463s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.523664421 +0000 UTC m=+141.824571916" watchObservedRunningTime="2025-12-02 20:00:06.555217463 +0000 UTC m=+141.856124958" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.558453 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" podStartSLOduration=121.558445938 podStartE2EDuration="2m1.558445938s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.554806443 +0000 UTC m=+141.855713938" watchObservedRunningTime="2025-12-02 20:00:06.558445938 +0000 UTC m=+141.859353433" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.565449 4807 generic.go:334] "Generic (PLEG): container finished" podID="b7c39998-d9b9-445b-8fa5-b338ccdfaf6f" containerID="f2403cb399644afc236c8f75ac117d71bfb0773f5af1e7de3c34196411802305" exitCode=0 Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.565578 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" event={"ID":"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f","Type":"ContainerDied","Data":"f2403cb399644afc236c8f75ac117d71bfb0773f5af1e7de3c34196411802305"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.583774 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.585523 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:07.085500364 +0000 UTC m=+142.386407859 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.585993 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" event={"ID":"350fb173-d354-4879-9e3b-c2429a682c05","Type":"ContainerStarted","Data":"7ad40f480e19cae20f792c0a9915a0da5f86660bd34c9522811df6e79c1c2660"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.586878 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.589414 4807 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-g89wb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Dec 02 20:00:06 crc kubenswrapper[4807]: 
I1202 20:00:06.589462 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" podUID="350fb173-d354-4879-9e3b-c2429a682c05" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.595130 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-bjmsc" podStartSLOduration=122.59511085 podStartE2EDuration="2m2.59511085s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.593694576 +0000 UTC m=+141.894602071" watchObservedRunningTime="2025-12-02 20:00:06.59511085 +0000 UTC m=+141.896018335" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.599633 4807 generic.go:334] "Generic (PLEG): container finished" podID="5a670da2-9970-471b-93fb-7dce5fe66c94" containerID="a38de89ffc7c4db663313f26589f9342c834c93e8c5977a4edd05f370c6daadb" exitCode=0 Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.599704 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" event={"ID":"5a670da2-9970-471b-93fb-7dce5fe66c94","Type":"ContainerDied","Data":"a38de89ffc7c4db663313f26589f9342c834c93e8c5977a4edd05f370c6daadb"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.605542 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dsdtv" event={"ID":"73115f00-4586-4157-86c0-332528cea752","Type":"ContainerStarted","Data":"210a0344a8bdc0376e2ee4903836c3de3fc8ac1c6e82bf953d9747425e498222"} Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.677501 4807 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" podStartSLOduration=121.677484365 podStartE2EDuration="2m1.677484365s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.631638718 +0000 UTC m=+141.932546213" watchObservedRunningTime="2025-12-02 20:00:06.677484365 +0000 UTC m=+141.978391860" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.677971 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-b9nt9" podStartSLOduration=122.677966626 podStartE2EDuration="2m2.677966626s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.674983376 +0000 UTC m=+141.975890871" watchObservedRunningTime="2025-12-02 20:00:06.677966626 +0000 UTC m=+141.978874121" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.685804 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.689018 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:07.188986695 +0000 UTC m=+142.489894190 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.743494 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5" podStartSLOduration=121.743476046 podStartE2EDuration="2m1.743476046s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.705074373 +0000 UTC m=+142.005981868" watchObservedRunningTime="2025-12-02 20:00:06.743476046 +0000 UTC m=+142.044383541" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.778314 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" podStartSLOduration=122.778299134 podStartE2EDuration="2m2.778299134s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.745494853 +0000 UTC m=+142.046402338" watchObservedRunningTime="2025-12-02 20:00:06.778299134 +0000 UTC m=+142.079206629" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.779057 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" podStartSLOduration=121.779051511 podStartE2EDuration="2m1.779051511s" podCreationTimestamp="2025-12-02 19:58:05 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.776943462 +0000 UTC m=+142.077850957" watchObservedRunningTime="2025-12-02 20:00:06.779051511 +0000 UTC m=+142.079958996" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.786864 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.786999 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:07.286973378 +0000 UTC m=+142.587880873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.787172 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.787599 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:07.287583652 +0000 UTC m=+142.588491147 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.818656 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dsdtv" podStartSLOduration=5.818628531 podStartE2EDuration="5.818628531s" podCreationTimestamp="2025-12-02 20:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.818299654 +0000 UTC m=+142.119207149" watchObservedRunningTime="2025-12-02 20:00:06.818628531 +0000 UTC m=+142.119536026" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.888379 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.889124 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:07.389105406 +0000 UTC m=+142.690012901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.952987 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-2kvcm" podStartSLOduration=121.952963777 podStartE2EDuration="2m1.952963777s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:06.951177475 +0000 UTC m=+142.252084970" watchObservedRunningTime="2025-12-02 20:00:06.952963777 +0000 UTC m=+142.253871272" Dec 02 20:00:06 crc kubenswrapper[4807]: I1202 20:00:06.991593 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:06 crc kubenswrapper[4807]: E1202 20:00:06.991983 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:07.491970783 +0000 UTC m=+142.792878278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.093229 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:07 crc kubenswrapper[4807]: E1202 20:00:07.093823 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:07.593780545 +0000 UTC m=+142.894688040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.182404 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.193499 4807 patch_prober.go:28] interesting pod/router-default-5444994796-2kvcm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 20:00:07 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Dec 02 20:00:07 crc kubenswrapper[4807]: [+]process-running ok Dec 02 20:00:07 crc kubenswrapper[4807]: healthz check failed Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.193590 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2kvcm" podUID="c3b0b579-6db1-463a-b391-6799f284b89a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.195609 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:07 crc kubenswrapper[4807]: E1202 20:00:07.196047 4807 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:07.696019347 +0000 UTC m=+142.996926842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.300395 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:07 crc kubenswrapper[4807]: E1202 20:00:07.302499 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:07.802461648 +0000 UTC m=+143.103369143 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.325899 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.328230 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.341100 4807 patch_prober.go:28] interesting pod/apiserver-76f77b778f-p49z8 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 02 20:00:07 crc kubenswrapper[4807]: [+]log ok Dec 02 20:00:07 crc kubenswrapper[4807]: [+]etcd ok Dec 02 20:00:07 crc kubenswrapper[4807]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 02 20:00:07 crc kubenswrapper[4807]: [+]poststarthook/generic-apiserver-start-informers ok Dec 02 20:00:07 crc kubenswrapper[4807]: [+]poststarthook/max-in-flight-filter ok Dec 02 20:00:07 crc kubenswrapper[4807]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 02 20:00:07 crc kubenswrapper[4807]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 02 20:00:07 crc kubenswrapper[4807]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 02 20:00:07 crc kubenswrapper[4807]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 02 20:00:07 crc kubenswrapper[4807]: 
[+]poststarthook/project.openshift.io-projectcache ok Dec 02 20:00:07 crc kubenswrapper[4807]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 02 20:00:07 crc kubenswrapper[4807]: [+]poststarthook/openshift.io-startinformers ok Dec 02 20:00:07 crc kubenswrapper[4807]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 02 20:00:07 crc kubenswrapper[4807]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 02 20:00:07 crc kubenswrapper[4807]: livez check failed Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.341568 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" podUID="b0dd45ac-1b9d-4274-96c1-98290213c989" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.388031 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.405186 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:07 crc kubenswrapper[4807]: E1202 20:00:07.408469 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:07.908449698 +0000 UTC m=+143.209357193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.515226 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:07 crc kubenswrapper[4807]: E1202 20:00:07.517973 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:08.017954521 +0000 UTC m=+143.318862016 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.620674 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:07 crc kubenswrapper[4807]: E1202 20:00:07.621401 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:08.121388561 +0000 UTC m=+143.422296046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.640186 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" event={"ID":"3ef6cafd-6676-4a26-9b8f-96317dda91bf","Type":"ContainerStarted","Data":"2d6585c5ddac2f49d7eca00ff61781a25e844234d6ed80ab374dae04921a1be6"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.640241 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" event={"ID":"3ef6cafd-6676-4a26-9b8f-96317dda91bf","Type":"ContainerStarted","Data":"d2d0b3e002fd4a6ee0b5851588eef78189b42065387a32ed659e955fb1e23dd8"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.641380 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.659052 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" event={"ID":"55c432b0-edb0-4988-994e-0a888a54621a","Type":"ContainerStarted","Data":"8a09b7d7b083805f026e461fe4c42ceb7b70485e7b1a9597b953606aab6a7fc8"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.687687 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" 
event={"ID":"30a68e73-24db-4b31-a2d3-7d25eb88f0ff","Type":"ContainerStarted","Data":"66f7c90e9f8ad63809f5ee4b63b5af7eff2c289343c90ddd0f62838beee53250"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.696068 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" event={"ID":"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c","Type":"ContainerStarted","Data":"b3f53f284cf96205f4427e5b70fc2bc074e2ba7c47dfd741d752f28115a2d3bb"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.697283 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.704497 4807 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2c9sc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.704557 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" podUID="b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.721975 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:07 crc kubenswrapper[4807]: E1202 20:00:07.722557 4807 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:08.222535148 +0000 UTC m=+143.523442643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.728323 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" event={"ID":"ae77852d-9558-4ceb-9eef-1b65bb912a92","Type":"ContainerStarted","Data":"035ecbfc80021ab32df09ac8bae1a972c2b26844fc827287554afaa75970a955"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.736411 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ssjb6" event={"ID":"744130db-de38-4c78-9684-95c04e411397","Type":"ContainerStarted","Data":"068cf20d6e116039426c344fbdf5b95a1d223123fee9300438bdd1f4ec98d55e"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.737641 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" event={"ID":"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a","Type":"ContainerStarted","Data":"081b0a8e00ea83fc581ecf985163615fed07b6224449dafe786750dd632b49b9"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.738335 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" 
event={"ID":"0555dc0e-5155-49f2-8c17-cfd091afacf9","Type":"ContainerStarted","Data":"9abe9d9e87e78c4097cf29fd149f37a54d1c9106a834bf0ed6148f29bae94d54"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.751527 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" podStartSLOduration=122.751509378 podStartE2EDuration="2m2.751509378s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:07.679105487 +0000 UTC m=+142.980012982" watchObservedRunningTime="2025-12-02 20:00:07.751509378 +0000 UTC m=+143.052416863" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.752310 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" podStartSLOduration=122.752302357 podStartE2EDuration="2m2.752302357s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:07.750340481 +0000 UTC m=+143.051247976" watchObservedRunningTime="2025-12-02 20:00:07.752302357 +0000 UTC m=+143.053209842" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.752418 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" event={"ID":"3a3480f6-d49a-4f37-9f9a-605d9efc851e","Type":"ContainerStarted","Data":"e8f6bfc40e9e6da75b070ee76f7f08a2e023c08315bc4bb840a79d12bb2a1fb6"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.754172 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" 
event={"ID":"3d3a8832-4745-4b77-b855-edea43d079d4","Type":"ContainerStarted","Data":"6d3b7ae5a49e157160fb263e108d0da7878774bded5493cd4bf51b6b1106ff45"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.756022 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kq6q9" event={"ID":"f69009ba-27a8-4518-b6d9-97194d8a77b3","Type":"ContainerStarted","Data":"70844fea0fad7e79e5ec8fee89c2f06eae123cd7ea762d96c972bcbfc0770b36"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.786305 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" event={"ID":"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df","Type":"ContainerStarted","Data":"20d9e4fec45b544a7eb104e0a7283d3b9c6fa980d49f6f17db635e5608d6e64d"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.817328 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" event={"ID":"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c","Type":"ContainerStarted","Data":"6f760c1ddf090a903a26fa6c015461a7865ba8a29769cd54672dd46ca0547a9b"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.826770 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:07 crc kubenswrapper[4807]: E1202 20:00:07.827288 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:08.327269658 +0000 UTC m=+143.628177143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.845554 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v476v" event={"ID":"11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d","Type":"ContainerStarted","Data":"b667b3eb2d6a1fa677efad35f482b1d9fe75f03ec418ee3235f8d83cc8b39750"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.890452 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" event={"ID":"30d55e05-d66f-496a-b4e1-fb6deb38895f","Type":"ContainerStarted","Data":"2f492293b01fc826df1b80ef2ebad2fa6f0a0f7a9d2fb144970d08b059c6bd83"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.890517 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" event={"ID":"30d55e05-d66f-496a-b4e1-fb6deb38895f","Type":"ContainerStarted","Data":"238b7757227bf360cacfa4d9fdaeaf0cbc820e629e53dab7aac55c0ce33c7899"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.890957 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.894919 4807 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-v4lp2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" start-of-body= Dec 02 
20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.895357 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.928480 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:07 crc kubenswrapper[4807]: E1202 20:00:07.930054 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:08.430032093 +0000 UTC m=+143.730939588 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.941513 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv" event={"ID":"f1b36133-0f4e-49bf-b33d-224112b2a964","Type":"ContainerStarted","Data":"f70a5c0510cb4fbddabc68d993e0fa97e46d2272aadb0c1027ed412575aea31a"} Dec 02 20:00:07 crc kubenswrapper[4807]: I1202 20:00:07.973484 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-n7sh5" event={"ID":"617adf8b-8ca9-4578-8c24-8f6b22713567","Type":"ContainerStarted","Data":"63d6dd3e66612fc3628c17454fd378fb6505662011513978535162903c79eac6"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.004066 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" event={"ID":"623a789c-ee08-43de-b81c-8f1499ab8fbc","Type":"ContainerStarted","Data":"7e5ea57d1fd77b64e757a87b079173413181aa638f7f0c07f94829c1d4f8067f"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.017501 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-b46dw" event={"ID":"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00","Type":"ContainerStarted","Data":"3f95166591db05c6ea65c232527bb72f39f173b14d4f7282cdb009e414f7618b"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.017834 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-b46dw" event={"ID":"58f0bd92-2b1f-4d7e-899f-556bbf8cdf00","Type":"ContainerStarted","Data":"cffe689dbbf2b81aa30fdf8e1cd504546f8a6cdac38453b9f07a5e3391a8fbd8"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.019095 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.021614 4807 patch_prober.go:28] interesting pod/console-operator-58897d9998-b46dw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/readyz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.021665 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-b46dw" podUID="58f0bd92-2b1f-4d7e-899f-556bbf8cdf00" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/readyz\": dial tcp 10.217.0.34:8443: connect: connection refused" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.022816 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" event={"ID":"3bc7f08d-6984-4bab-9220-761b68fdec0d","Type":"ContainerStarted","Data":"1d0bedb05fdcad8ad84c571471aa13395eb036822ff5a066e63600cd3240a83c"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.022840 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" event={"ID":"3bc7f08d-6984-4bab-9220-761b68fdec0d","Type":"ContainerStarted","Data":"4ac6d4667f3a27ba561f5f1687bfc97676bd0cc41edf2c332857c2aae9441326"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.034237 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.036556 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:08.536535015 +0000 UTC m=+143.837442500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.048979 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" event={"ID":"96d50594-2379-4758-9882-7328d7bdf1fb","Type":"ContainerStarted","Data":"c01623538f53dc49e464acf286d39cb9c61a88f2b2bed6abb2cc12e85acdd85a"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.060967 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" podStartSLOduration=124.060950499 podStartE2EDuration="2m4.060950499s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:07.931219491 +0000 UTC 
m=+143.232126986" watchObservedRunningTime="2025-12-02 20:00:08.060950499 +0000 UTC m=+143.361857994" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.064464 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" event={"ID":"0622ab0a-2b5b-4f71-a9b1-8573297d6351","Type":"ContainerStarted","Data":"ec86936b5a923c1e877aaae79738030169e3d1c20a04fdfd9d8c510ccea1343e"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.079002 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" event={"ID":"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8","Type":"ContainerStarted","Data":"fe4236d4ef3ea49f64c8e8f987462d44597963e8840d0e550a1f110834d67fa4"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.079067 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" event={"ID":"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8","Type":"ContainerStarted","Data":"a0ed2dfe8b60fec3c94db7f11b4e3cfe24f59febcbfaecbf635ebe9106628e78"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.094958 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b8g8z" event={"ID":"5dcbd369-4c3b-4d52-b93f-a2d2e1947163","Type":"ContainerStarted","Data":"03f1f2d5967946cca95f62934a088fc3b130f03d56d7275efa51d8131a57fd8c"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.103760 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-b46dw" podStartSLOduration=124.103732654 podStartE2EDuration="2m4.103732654s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:08.061222645 +0000 UTC m=+143.362130140" watchObservedRunningTime="2025-12-02 
20:00:08.103732654 +0000 UTC m=+143.404640149" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.108538 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" event={"ID":"947012be-68b2-4b9a-b020-258b9b1f0ce8","Type":"ContainerStarted","Data":"5daeec6955e8bd4c137995f18a10fa4cfba8382d55951bb287643821c33639e7"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.108592 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" event={"ID":"947012be-68b2-4b9a-b020-258b9b1f0ce8","Type":"ContainerStarted","Data":"30a25914ddf0ced78a0cceb2d7c61382a80d6c233bcf434dd481fc51347c628e"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.119876 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" event={"ID":"350fb173-d354-4879-9e3b-c2429a682c05","Type":"ContainerStarted","Data":"e4a02df01b489f4c6e68a9de172377ea116a92121c0ef1664ebaeb2be362a49c"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.134046 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" event={"ID":"63a3212f-beb2-4bc0-b88a-947b3a7653c2","Type":"ContainerStarted","Data":"45233bac3cf1d48a20b05af3353c01d2b6bceb8f83a184422a2aefc84f24f7ba"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.134096 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" event={"ID":"63a3212f-beb2-4bc0-b88a-947b3a7653c2","Type":"ContainerStarted","Data":"a0a6912a215b7071a7c9e5d170d084661633c085556b0be35d800265319059df"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.136459 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-g89wb" Dec 02 20:00:08 crc 
kubenswrapper[4807]: I1202 20:00:08.138195 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.138462 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:08.638431379 +0000 UTC m=+143.939338874 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.138674 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.146299 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 20:00:08.646276664 +0000 UTC m=+143.947184159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.147886 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" podStartSLOduration=123.147864621 podStartE2EDuration="2m3.147864621s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:08.138816078 +0000 UTC m=+143.439723573" watchObservedRunningTime="2025-12-02 20:00:08.147864621 +0000 UTC m=+143.448772116" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.149124 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" podStartSLOduration=8.1491143 podStartE2EDuration="8.1491143s" podCreationTimestamp="2025-12-02 20:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:08.101102452 +0000 UTC m=+143.402009947" watchObservedRunningTime="2025-12-02 20:00:08.1491143 +0000 UTC m=+143.450021795" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.149129 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" 
event={"ID":"631034d3-e8b2-42a8-9566-8f8922464b56","Type":"ContainerStarted","Data":"63fa42a67d5018bd1f5f20323826934c4394e086c8d557bd3358874dc54704f9"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.150127 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" event={"ID":"631034d3-e8b2-42a8-9566-8f8922464b56","Type":"ContainerStarted","Data":"a384355a0d6bc8969d0f8e8ceecdad7576b8b42f43fae288096f0cfc7a01a674"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.167743 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" event={"ID":"bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5","Type":"ContainerStarted","Data":"0703774c8f2dc18157e9104c286d1a861d8bfbe3c2edfc5d50c82af546628fc4"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.167890 4807 patch_prober.go:28] interesting pod/downloads-7954f5f757-frmzm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.169451 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-frmzm" podUID="b5784f10-10dd-4363-a50a-60d37b9c9ec5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.170249 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" event={"ID":"bfb23a2d-392b-44ff-8b0e-2cbe7749cbc5","Type":"ContainerStarted","Data":"f9699aa309dc509a5f03c17b7f79450b0bf27e76561d227b2c91cb18a72d53a7"} Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 
20:00:08.194463 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zm2nv" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.200413 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.208000 4807 patch_prober.go:28] interesting pod/router-default-5444994796-2kvcm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 20:00:08 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Dec 02 20:00:08 crc kubenswrapper[4807]: [+]process-running ok Dec 02 20:00:08 crc kubenswrapper[4807]: healthz check failed Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.208100 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2kvcm" podUID="c3b0b579-6db1-463a-b391-6799f284b89a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.209412 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-x9bfb" podStartSLOduration=123.209363306 podStartE2EDuration="2m3.209363306s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:08.191529157 +0000 UTC m=+143.492436652" watchObservedRunningTime="2025-12-02 20:00:08.209363306 +0000 UTC m=+143.510270811" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.240540 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.242166 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:08.742141286 +0000 UTC m=+144.043048781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.284130 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-n9g5h" podStartSLOduration=123.284112112 podStartE2EDuration="2m3.284112112s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:08.282440583 +0000 UTC m=+143.583348078" watchObservedRunningTime="2025-12-02 20:00:08.284112112 +0000 UTC m=+143.585019607" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.343253 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.348473 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:08.848454884 +0000 UTC m=+144.149362379 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.366378 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kmk94" podStartSLOduration=123.366361574 podStartE2EDuration="2m3.366361574s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:08.317355293 +0000 UTC m=+143.618262778" watchObservedRunningTime="2025-12-02 20:00:08.366361574 +0000 UTC m=+143.667269069" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.422970 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-49lk4" podStartSLOduration=124.422949814 podStartE2EDuration="2m4.422949814s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:08.419185056 +0000 UTC m=+143.720092541" watchObservedRunningTime="2025-12-02 20:00:08.422949814 +0000 UTC m=+143.723857309" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.444258 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.444492 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:08.944457489 +0000 UTC m=+144.245364984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.445229 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.445652 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:08.945643407 +0000 UTC m=+144.246550902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.553670 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.554040 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:09.054023544 +0000 UTC m=+144.354931039 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.657735 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.658676 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:09.158658752 +0000 UTC m=+144.459566247 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.670786 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dc74n"] Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.671914 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.702237 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.732943 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dc74n"] Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.766654 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.766935 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjk9k\" (UniqueName: \"kubernetes.io/projected/095ad74d-f5f1-44f3-9007-c779f4f06f62-kube-api-access-wjk9k\") pod \"certified-operators-dc74n\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " 
pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.766992 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-catalog-content\") pod \"certified-operators-dc74n\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.767096 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-utilities\") pod \"certified-operators-dc74n\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.767260 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:09.267237583 +0000 UTC m=+144.568145078 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.828585 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lmdbd"] Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.830314 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.833425 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.849082 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lmdbd"] Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.870765 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-catalog-content\") pod \"certified-operators-dc74n\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.870834 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.870912 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-utilities\") pod \"certified-operators-dc74n\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.870936 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjk9k\" (UniqueName: \"kubernetes.io/projected/095ad74d-f5f1-44f3-9007-c779f4f06f62-kube-api-access-wjk9k\") pod \"certified-operators-dc74n\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.871858 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-catalog-content\") pod \"certified-operators-dc74n\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.872145 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:09.372132178 +0000 UTC m=+144.673039673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.872203 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-utilities\") pod \"certified-operators-dc74n\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.918804 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjk9k\" (UniqueName: \"kubernetes.io/projected/095ad74d-f5f1-44f3-9007-c779f4f06f62-kube-api-access-wjk9k\") pod \"certified-operators-dc74n\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.972098 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.972368 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkbcd\" (UniqueName: \"kubernetes.io/projected/21184a59-8520-4dd0-b459-a056b42e852d-kube-api-access-jkbcd\") pod \"community-operators-lmdbd\" (UID: 
\"21184a59-8520-4dd0-b459-a056b42e852d\") " pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.972430 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-catalog-content\") pod \"community-operators-lmdbd\" (UID: \"21184a59-8520-4dd0-b459-a056b42e852d\") " pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:00:08 crc kubenswrapper[4807]: I1202 20:00:08.972471 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-utilities\") pod \"community-operators-lmdbd\" (UID: \"21184a59-8520-4dd0-b459-a056b42e852d\") " pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:00:08 crc kubenswrapper[4807]: E1202 20:00:08.972743 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:09.47269334 +0000 UTC m=+144.773600835 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.014222 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hrpb5"] Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.015907 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.047410 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrpb5"] Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.077699 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.077804 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkbcd\" (UniqueName: \"kubernetes.io/projected/21184a59-8520-4dd0-b459-a056b42e852d-kube-api-access-jkbcd\") pod \"community-operators-lmdbd\" (UID: \"21184a59-8520-4dd0-b459-a056b42e852d\") " pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.077841 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-catalog-content\") pod \"community-operators-lmdbd\" (UID: \"21184a59-8520-4dd0-b459-a056b42e852d\") " pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.077873 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-utilities\") pod \"community-operators-lmdbd\" (UID: \"21184a59-8520-4dd0-b459-a056b42e852d\") " pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.078420 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-utilities\") pod \"community-operators-lmdbd\" (UID: \"21184a59-8520-4dd0-b459-a056b42e852d\") " pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:00:09 crc kubenswrapper[4807]: E1202 20:00:09.078747 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:09.578733132 +0000 UTC m=+144.879640627 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.079485 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-catalog-content\") pod \"community-operators-lmdbd\" (UID: \"21184a59-8520-4dd0-b459-a056b42e852d\") " pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.094148 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.138057 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkbcd\" (UniqueName: \"kubernetes.io/projected/21184a59-8520-4dd0-b459-a056b42e852d-kube-api-access-jkbcd\") pod \"community-operators-lmdbd\" (UID: \"21184a59-8520-4dd0-b459-a056b42e852d\") " pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.184400 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.185430 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.185961 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-utilities\") pod \"certified-operators-hrpb5\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.186117 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz2sr\" (UniqueName: \"kubernetes.io/projected/807e5ccc-6b8a-4be7-8b8c-34888237e22d-kube-api-access-mz2sr\") pod \"certified-operators-hrpb5\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.186310 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-catalog-content\") pod \"certified-operators-hrpb5\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:00:09 crc kubenswrapper[4807]: E1202 20:00:09.186569 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-02 20:00:09.686549135 +0000 UTC m=+144.987456630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.191022 4807 patch_prober.go:28] interesting pod/router-default-5444994796-2kvcm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 20:00:09 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Dec 02 20:00:09 crc kubenswrapper[4807]: [+]process-running ok Dec 02 20:00:09 crc kubenswrapper[4807]: healthz check failed Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.191110 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2kvcm" podUID="c3b0b579-6db1-463a-b391-6799f284b89a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.213775 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ncdvx"] Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.216409 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.222269 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" event={"ID":"0555dc0e-5155-49f2-8c17-cfd091afacf9","Type":"ContainerStarted","Data":"d450a4ed6a9d536ec79a4bd9ec38492419bcce935e732e61b008318a1833ad1c"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.222326 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" event={"ID":"0555dc0e-5155-49f2-8c17-cfd091afacf9","Type":"ContainerStarted","Data":"1858476d68bdf0dbd4f06ea2b2b53edbfe01ed45a6a65f801b19f326b5c2b9fe"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.271356 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ncdvx"] Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.290746 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-utilities\") pod \"certified-operators-hrpb5\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.290799 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz2sr\" (UniqueName: \"kubernetes.io/projected/807e5ccc-6b8a-4be7-8b8c-34888237e22d-kube-api-access-mz2sr\") pod \"certified-operators-hrpb5\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.290845 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.290877 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-catalog-content\") pod \"certified-operators-hrpb5\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.291463 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-utilities\") pod \"certified-operators-hrpb5\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.296303 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-catalog-content\") pod \"certified-operators-hrpb5\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:00:09 crc kubenswrapper[4807]: E1202 20:00:09.318827 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:09.818799362 +0000 UTC m=+145.119706847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.319214 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" event={"ID":"3d3a8832-4745-4b77-b855-edea43d079d4","Type":"ContainerStarted","Data":"5ec361d494d2d9acb69e26786ee5d051dd52c4a125d4a02a55fe1357be8c434c"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.348262 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz2sr\" (UniqueName: \"kubernetes.io/projected/807e5ccc-6b8a-4be7-8b8c-34888237e22d-kube-api-access-mz2sr\") pod \"certified-operators-hrpb5\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.348814 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.355582 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" event={"ID":"55c432b0-edb0-4988-994e-0a888a54621a","Type":"ContainerStarted","Data":"e8ef1c2984e52e1501928567dcb66ae1f7443cf84c3be564e03a2be0015e3819"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.369869 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" event={"ID":"98cb9822-a3b6-48a8-93ce-1b619a9a8e7c","Type":"ContainerStarted","Data":"fff0a451d8ff4dbe289409d72b82240f4b61ee92e8ad185123ce23f8e09e31b3"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.391655 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.391916 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-catalog-content\") pod \"community-operators-ncdvx\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.391971 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-utilities\") pod \"community-operators-ncdvx\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " pod="openshift-marketplace/community-operators-ncdvx" Dec 02 
20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.391990 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95trv\" (UniqueName: \"kubernetes.io/projected/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-kube-api-access-95trv\") pod \"community-operators-ncdvx\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.404474 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-69994" podStartSLOduration=124.404456375 podStartE2EDuration="2m4.404456375s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.358598047 +0000 UTC m=+144.659505542" watchObservedRunningTime="2025-12-02 20:00:09.404456375 +0000 UTC m=+144.705363870" Dec 02 20:00:09 crc kubenswrapper[4807]: E1202 20:00:09.405438 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:09.905423197 +0000 UTC m=+145.206330692 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.447940 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xgm55" podStartSLOduration=124.447922566 podStartE2EDuration="2m4.447922566s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.405172381 +0000 UTC m=+144.706079876" watchObservedRunningTime="2025-12-02 20:00:09.447922566 +0000 UTC m=+144.748830061" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.453976 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" event={"ID":"5a670da2-9970-471b-93fb-7dce5fe66c94","Type":"ContainerStarted","Data":"c4f670f44ac18dbffddc082ab85dbe8e2240a8edf5c08f571ab764fa1467a3fe"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.454928 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.467080 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" event={"ID":"623a789c-ee08-43de-b81c-8f1499ab8fbc","Type":"ContainerStarted","Data":"ec9061ace2d8d42296e065c1a10bd10d03fef19b2163461fadf63739a1d2fb0f"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 
20:00:09.474027 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" event={"ID":"b7c39998-d9b9-445b-8fa5-b338ccdfaf6f","Type":"ContainerStarted","Data":"aac46f1038634704bcd20a269e5f4db27c4b52d435d811daade1065eacaa9368"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.476306 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7727z" podStartSLOduration=125.476290922 podStartE2EDuration="2m5.476290922s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.449206606 +0000 UTC m=+144.750114091" watchObservedRunningTime="2025-12-02 20:00:09.476290922 +0000 UTC m=+144.777198417" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.476609 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tt5pr" event={"ID":"96d50594-2379-4758-9882-7328d7bdf1fb","Type":"ContainerStarted","Data":"9a8b28dfae1750713f961ba475df20c7638af0fd6e93d3176df770905ef189eb"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.478033 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" podStartSLOduration=125.478027373 podStartE2EDuration="2m5.478027373s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.476231511 +0000 UTC m=+144.777139016" watchObservedRunningTime="2025-12-02 20:00:09.478027373 +0000 UTC m=+144.778934868" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.494915 4807 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-utilities\") pod \"community-operators-ncdvx\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.494960 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95trv\" (UniqueName: \"kubernetes.io/projected/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-kube-api-access-95trv\") pod \"community-operators-ncdvx\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.495061 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.495206 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-catalog-content\") pod \"community-operators-ncdvx\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:00:09 crc kubenswrapper[4807]: E1202 20:00:09.496164 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:09.996145249 +0000 UTC m=+145.297052744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.496805 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-utilities\") pod \"community-operators-ncdvx\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.499774 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-catalog-content\") pod \"community-operators-ncdvx\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.515264 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" podStartSLOduration=124.515250508 podStartE2EDuration="2m4.515250508s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.512842511 +0000 UTC m=+144.813750006" watchObservedRunningTime="2025-12-02 20:00:09.515250508 +0000 UTC m=+144.816158003" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.531421 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v476v" 
event={"ID":"11e7d49d-f3e9-4b30-bb7e-3c90942d3c0d","Type":"ContainerStarted","Data":"8f281d9502e3c3b35b6d3e5fb515a286b4081f38e235ee22e2182154bb47b685"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.531482 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95trv\" (UniqueName: \"kubernetes.io/projected/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-kube-api-access-95trv\") pod \"community-operators-ncdvx\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.554065 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.578603 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-89tmm" podStartSLOduration=124.578585176 podStartE2EDuration="2m4.578585176s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.544938915 +0000 UTC m=+144.845846410" watchObservedRunningTime="2025-12-02 20:00:09.578585176 +0000 UTC m=+144.879492671" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.593153 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" event={"ID":"947012be-68b2-4b9a-b020-258b9b1f0ce8","Type":"ContainerStarted","Data":"2d115c4c98af9d6c413c8cb3873295cba299b0aa91be4fe9551c67e8b0925799"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.603796 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:09 crc kubenswrapper[4807]: E1202 20:00:09.605025 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:10.105009047 +0000 UTC m=+145.405916542 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.630915 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b8g8z" event={"ID":"5dcbd369-4c3b-4d52-b93f-a2d2e1947163","Type":"ContainerStarted","Data":"ad9bf0b9e7d67187e6d92c1917380e1e9fcb493d7f0e519ef4d687d8b2558432"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.630967 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b8g8z" event={"ID":"5dcbd369-4c3b-4d52-b93f-a2d2e1947163","Type":"ContainerStarted","Data":"fef0c9df0c6b8f90c561b6a60fbebae455c725b488f145940e2ff7d755395894"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.631283 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.636140 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-v476v" podStartSLOduration=8.636081167 podStartE2EDuration="8.636081167s" podCreationTimestamp="2025-12-02 20:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.588947999 +0000 UTC m=+144.889855494" watchObservedRunningTime="2025-12-02 20:00:09.636081167 +0000 UTC m=+144.936988672" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.639381 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w42b5" podStartSLOduration=124.639367904 podStartE2EDuration="2m4.639367904s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.636520257 +0000 UTC m=+144.937427752" watchObservedRunningTime="2025-12-02 20:00:09.639367904 +0000 UTC m=+144.940275399" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.667871 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" event={"ID":"3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8","Type":"ContainerStarted","Data":"d991dd6be7450cb9fdcab511f642fe558de42f984c54a55a39ee4062393983b5"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.671006 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" event={"ID":"30a68e73-24db-4b31-a2d3-7d25eb88f0ff","Type":"ContainerStarted","Data":"793a43459f7f8e982c7d16f30d4a4b85292607ce73f0f0b97210bf61c3515787"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.672470 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" 
event={"ID":"ae77852d-9558-4ceb-9eef-1b65bb912a92","Type":"ContainerStarted","Data":"52f62cb867147aa5ef54fb57d32da3d204bca6bec42d8499e196fdf72dea9012"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.673527 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-b8g8z" podStartSLOduration=8.673515886 podStartE2EDuration="8.673515886s" podCreationTimestamp="2025-12-02 20:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.669232756 +0000 UTC m=+144.970140241" watchObservedRunningTime="2025-12-02 20:00:09.673515886 +0000 UTC m=+144.974423381" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.710087 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:09 crc kubenswrapper[4807]: E1202 20:00:09.712273 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:10.212249276 +0000 UTC m=+145.513156771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.715502 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" event={"ID":"3a3480f6-d49a-4f37-9f9a-605d9efc851e","Type":"ContainerStarted","Data":"0657c67adf1923772cfb0ffb6e7eb651de2235d86d7feccde869ed05c5d79dbe"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.732850 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-pzlxq" podStartSLOduration=124.73283001 podStartE2EDuration="2m4.73283001s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.732196425 +0000 UTC m=+145.033103920" watchObservedRunningTime="2025-12-02 20:00:09.73283001 +0000 UTC m=+145.033737505" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.734186 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7c8mj" podStartSLOduration=124.734177621 podStartE2EDuration="2m4.734177621s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.704530695 +0000 UTC m=+145.005438190" watchObservedRunningTime="2025-12-02 20:00:09.734177621 +0000 UTC m=+145.035085116" Dec 02 20:00:09 crc 
kubenswrapper[4807]: I1202 20:00:09.747053 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv" event={"ID":"f1b36133-0f4e-49bf-b33d-224112b2a964","Type":"ContainerStarted","Data":"7638f1e871075b5c214eb3efd60407f657476dc592f4fb715ada00ba3ed88632"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.747110 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv" event={"ID":"f1b36133-0f4e-49bf-b33d-224112b2a964","Type":"ContainerStarted","Data":"b6cbc795bc61a097c3ccc7390fba72575862d3cff164f136f8ad2763ac5c1487"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.772308 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" event={"ID":"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df","Type":"ContainerStarted","Data":"9ce20a6dcb15e780d26a4f47aeb7eb707edd9bd280fb7a08fe3eae3cfb143919"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.791010 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" event={"ID":"d66ba69f-89dd-49ea-91f6-6682ce7bdc8a","Type":"ContainerStarted","Data":"55ccfcb76a45fbc38b27af4ecec65d39a8bdf2bb962c19fb4a75f727d0ae53e0"} Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.791286 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.800287 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-k6bt9" podStartSLOduration=125.800248114 podStartE2EDuration="2m5.800248114s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-02 20:00:09.768697672 +0000 UTC m=+145.069605177" watchObservedRunningTime="2025-12-02 20:00:09.800248114 +0000 UTC m=+145.101155609" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.801145 4807 patch_prober.go:28] interesting pod/downloads-7954f5f757-frmzm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.801189 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-frmzm" podUID="b5784f10-10dd-4363-a50a-60d37b9c9ec5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.802926 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-lmjwv" podStartSLOduration=124.802912896 podStartE2EDuration="2m4.802912896s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.801627686 +0000 UTC m=+145.102535181" watchObservedRunningTime="2025-12-02 20:00:09.802912896 +0000 UTC m=+145.103820391" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.818324 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.818574 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-b46dw" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.819173 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.819525 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:00:09 crc kubenswrapper[4807]: E1202 20:00:09.819620 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:10.319601508 +0000 UTC m=+145.620509003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.855967 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" podStartSLOduration=124.855949032 podStartE2EDuration="2m4.855949032s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:09.853772441 +0000 UTC m=+145.154679946" watchObservedRunningTime="2025-12-02 20:00:09.855949032 +0000 UTC m=+145.156856577" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.931100 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:09 crc kubenswrapper[4807]: I1202 20:00:09.944971 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lmdbd"] Dec 02 20:00:09 crc kubenswrapper[4807]: E1202 20:00:09.948755 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:10.448728002 +0000 UTC m=+145.749635497 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.039308 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:10 crc kubenswrapper[4807]: E1202 20:00:10.039775 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:10.539759741 +0000 UTC m=+145.840667236 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.066703 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" podStartSLOduration=125.066664803 podStartE2EDuration="2m5.066664803s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:10.019774811 +0000 UTC m=+145.320682316" watchObservedRunningTime="2025-12-02 20:00:10.066664803 +0000 UTC m=+145.367572298" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.127819 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dc74n"] Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.140760 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:10 crc kubenswrapper[4807]: E1202 20:00:10.141137 4807 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:10.641118173 +0000 UTC m=+145.942025668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.187703 4807 patch_prober.go:28] interesting pod/router-default-5444994796-2kvcm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 20:00:10 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Dec 02 20:00:10 crc kubenswrapper[4807]: [+]process-running ok Dec 02 20:00:10 crc kubenswrapper[4807]: healthz check failed Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.188166 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2kvcm" podUID="c3b0b579-6db1-463a-b391-6799f284b89a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.245859 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:10 crc kubenswrapper[4807]: E1202 20:00:10.246298 4807 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:10.746277743 +0000 UTC m=+146.047185238 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.284031 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8sxvh" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.319773 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hrpb5"] Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.350582 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:10 crc kubenswrapper[4807]: E1202 20:00:10.351090 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:10.851074425 +0000 UTC m=+146.151981920 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.453331 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:10 crc kubenswrapper[4807]: E1202 20:00:10.454048 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:10.954029184 +0000 UTC m=+146.254936679 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.512891 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ncdvx"] Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.558558 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:10 crc kubenswrapper[4807]: E1202 20:00:10.559115 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.059103522 +0000 UTC m=+146.360011017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.670122 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:10 crc kubenswrapper[4807]: E1202 20:00:10.670476 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.170442628 +0000 UTC m=+146.471350123 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.670615 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:10 crc kubenswrapper[4807]: E1202 20:00:10.671053 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.171045352 +0000 UTC m=+146.471952847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.771653 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:10 crc kubenswrapper[4807]: E1202 20:00:10.772335 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.272304001 +0000 UTC m=+146.573211496 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.772398 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:10 crc kubenswrapper[4807]: E1202 20:00:10.772784 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.272773462 +0000 UTC m=+146.573681137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.805679 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x9z86"] Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.807030 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.812708 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.828454 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x9z86"] Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.831922 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc74n" event={"ID":"095ad74d-f5f1-44f3-9007-c779f4f06f62","Type":"ContainerStarted","Data":"f8103470c6692017c6e127fc486d5130985f6e37c306bf11625fd3d4a3626348"} Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.832160 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc74n" event={"ID":"095ad74d-f5f1-44f3-9007-c779f4f06f62","Type":"ContainerStarted","Data":"0ce1fd7e3b3af0c81c9c535717cd9de387d979b6f24c942370e998b792da5e68"} Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.850363 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 
20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.862796 4807 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.865499 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" event={"ID":"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df","Type":"ContainerStarted","Data":"167ce179830ec708d16868d9356dced4f3f5bc6ad7959b546feefbfe96724ba5"} Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.876335 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpb5" event={"ID":"807e5ccc-6b8a-4be7-8b8c-34888237e22d","Type":"ContainerStarted","Data":"676e6340d36ad1769d6d795830ec4b0950c123179c0d032568241a2e167f02ad"} Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.879268 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncdvx" event={"ID":"3d555627-e360-46ad-b6d9-ec1cc3ce1d68","Type":"ContainerStarted","Data":"d5442ece930673a860856c148c3302cb31832c9594f9beb2b6afdddaef57e6ea"} Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.879898 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.880082 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4kpp\" (UniqueName: \"kubernetes.io/projected/efbbb9d8-ccd1-40c9-a146-20dffe720203-kube-api-access-j4kpp\") pod \"redhat-marketplace-x9z86\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " 
pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.880146 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-utilities\") pod \"redhat-marketplace-x9z86\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.880222 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-catalog-content\") pod \"redhat-marketplace-x9z86\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:00:10 crc kubenswrapper[4807]: E1202 20:00:10.880332 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.380316109 +0000 UTC m=+146.681223604 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.885471 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c4hb8" event={"ID":"3a3480f6-d49a-4f37-9f9a-605d9efc851e","Type":"ContainerStarted","Data":"e7fe19b30c7049c56fac7d67c810f93bfc815887d9a7c7f8fc28638e55d4fbfe"} Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.897784 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" event={"ID":"3d3a8832-4745-4b77-b855-edea43d079d4","Type":"ContainerStarted","Data":"1e42654ed80f0b8756f668a41e31cac405396f4ddc750220f209ff5f27073546"} Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.908838 4807 generic.go:334] "Generic (PLEG): container finished" podID="21184a59-8520-4dd0-b459-a056b42e852d" containerID="bde1a117bdec2e795c46c481b632482a9e427cbd22e34d186220a4ed36c094cf" exitCode=0 Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.909490 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmdbd" event={"ID":"21184a59-8520-4dd0-b459-a056b42e852d","Type":"ContainerDied","Data":"bde1a117bdec2e795c46c481b632482a9e427cbd22e34d186220a4ed36c094cf"} Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.909511 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmdbd" 
event={"ID":"21184a59-8520-4dd0-b459-a056b42e852d","Type":"ContainerStarted","Data":"16567e25bb71449b6db8e18973b71a7094f024e0f93f8d53132b0377518a795b"} Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.950495 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-cbpn8" podStartSLOduration=125.950470437 podStartE2EDuration="2m5.950470437s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:10.943929264 +0000 UTC m=+146.244836759" watchObservedRunningTime="2025-12-02 20:00:10.950470437 +0000 UTC m=+146.251377932" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.989029 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4kpp\" (UniqueName: \"kubernetes.io/projected/efbbb9d8-ccd1-40c9-a146-20dffe720203-kube-api-access-j4kpp\") pod \"redhat-marketplace-x9z86\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.989339 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-utilities\") pod \"redhat-marketplace-x9z86\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.989581 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:10 crc 
kubenswrapper[4807]: I1202 20:00:10.989911 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-catalog-content\") pod \"redhat-marketplace-x9z86\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:00:10 crc kubenswrapper[4807]: I1202 20:00:10.992257 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-catalog-content\") pod \"redhat-marketplace-x9z86\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:10.996939 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-utilities\") pod \"redhat-marketplace-x9z86\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:00:11 crc kubenswrapper[4807]: E1202 20:00:11.000938 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.500920743 +0000 UTC m=+146.801828228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.033706 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4kpp\" (UniqueName: \"kubernetes.io/projected/efbbb9d8-ccd1-40c9-a146-20dffe720203-kube-api-access-j4kpp\") pod \"redhat-marketplace-x9z86\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.100383 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:11 crc kubenswrapper[4807]: E1202 20:00:11.100892 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.600863431 +0000 UTC m=+146.901770926 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.170674 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.183034 4807 patch_prober.go:28] interesting pod/router-default-5444994796-2kvcm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 20:00:11 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Dec 02 20:00:11 crc kubenswrapper[4807]: [+]process-running ok Dec 02 20:00:11 crc kubenswrapper[4807]: healthz check failed Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.183101 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2kvcm" podUID="c3b0b579-6db1-463a-b391-6799f284b89a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.193220 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k5smb"] Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.194690 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.202989 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:11 crc kubenswrapper[4807]: E1202 20:00:11.203392 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 20:00:11.703376639 +0000 UTC m=+147.004284134 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jh4tx" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.205101 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5smb"] Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.227351 4807 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-02T20:00:10.862812828Z","Handler":null,"Name":""} Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.233325 4807 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: 
kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.233371 4807 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.305016 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.305752 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-utilities\") pod \"redhat-marketplace-k5smb\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.305852 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-catalog-content\") pod \"redhat-marketplace-k5smb\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.305876 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8vjr\" (UniqueName: \"kubernetes.io/projected/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-kube-api-access-r8vjr\") pod \"redhat-marketplace-k5smb\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:00:11 crc 
kubenswrapper[4807]: I1202 20:00:11.312424 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.387617 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x9z86"] Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.407192 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-catalog-content\") pod \"redhat-marketplace-k5smb\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.407255 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8vjr\" (UniqueName: \"kubernetes.io/projected/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-kube-api-access-r8vjr\") pod \"redhat-marketplace-k5smb\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.407331 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.407386 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-utilities\") pod \"redhat-marketplace-k5smb\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.407827 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-catalog-content\") pod \"redhat-marketplace-k5smb\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.408037 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-utilities\") pod \"redhat-marketplace-k5smb\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.410405 4807 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.410435 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.428612 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8vjr\" (UniqueName: \"kubernetes.io/projected/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-kube-api-access-r8vjr\") pod \"redhat-marketplace-k5smb\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.512148 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.795488 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9mxjq"] Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.796902 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.801059 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.808989 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9mxjq"] Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.821145 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jh4tx\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.914549 4807 generic.go:334] "Generic (PLEG): container finished" podID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" containerID="97297ca81079be25d529bd22530880619781b1f002a7a085492c30343c0d2ad5" exitCode=0 Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.914608 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncdvx" event={"ID":"3d555627-e360-46ad-b6d9-ec1cc3ce1d68","Type":"ContainerDied","Data":"97297ca81079be25d529bd22530880619781b1f002a7a085492c30343c0d2ad5"} Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.915555 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhtv5\" (UniqueName: \"kubernetes.io/projected/40b1a65e-4886-466b-81ca-387c4b36310a-kube-api-access-dhtv5\") pod \"redhat-operators-9mxjq\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.915755 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-catalog-content\") pod \"redhat-operators-9mxjq\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.915812 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-utilities\") pod \"redhat-operators-9mxjq\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.917338 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x9z86" event={"ID":"efbbb9d8-ccd1-40c9-a146-20dffe720203","Type":"ContainerStarted","Data":"119dc2828f7a6d180ea30edb3377d13e95b9421260e9739f816fa27bfac0a24d"} Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.918965 4807 generic.go:334] "Generic (PLEG): container finished" podID="095ad74d-f5f1-44f3-9007-c779f4f06f62" containerID="f8103470c6692017c6e127fc486d5130985f6e37c306bf11625fd3d4a3626348" exitCode=0 Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.919031 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc74n" event={"ID":"095ad74d-f5f1-44f3-9007-c779f4f06f62","Type":"ContainerDied","Data":"f8103470c6692017c6e127fc486d5130985f6e37c306bf11625fd3d4a3626348"} Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.922817 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" event={"ID":"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df","Type":"ContainerStarted","Data":"aa33fa11aa45786caa63204dff2f80a446dca80d0e5b3f1feb7e7fb09273d86e"} Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.925585 
4807 generic.go:334] "Generic (PLEG): container finished" podID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" containerID="941459d432d6503c4a09073baa42647086df73febd836a0a79e4be11e2601240" exitCode=0 Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.925661 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpb5" event={"ID":"807e5ccc-6b8a-4be7-8b8c-34888237e22d","Type":"ContainerDied","Data":"941459d432d6503c4a09073baa42647086df73febd836a0a79e4be11e2601240"} Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.942565 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-cmwzb" Dec 02 20:00:11 crc kubenswrapper[4807]: I1202 20:00:11.961521 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5smb"] Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.017257 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhtv5\" (UniqueName: \"kubernetes.io/projected/40b1a65e-4886-466b-81ca-387c4b36310a-kube-api-access-dhtv5\") pod \"redhat-operators-9mxjq\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.017333 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.017385 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.017426 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.017514 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-catalog-content\") pod \"redhat-operators-9mxjq\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.017662 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-utilities\") pod \"redhat-operators-9mxjq\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.019504 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-catalog-content\") pod \"redhat-operators-9mxjq\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.021409 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-utilities\") pod \"redhat-operators-9mxjq\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.022176 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.035128 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.042517 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.066773 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.071533 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhtv5\" (UniqueName: \"kubernetes.io/projected/40b1a65e-4886-466b-81ca-387c4b36310a-kube-api-access-dhtv5\") pod \"redhat-operators-9mxjq\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.118591 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.123318 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.125642 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.188167 4807 patch_prober.go:28] interesting pod/router-default-5444994796-2kvcm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 20:00:12 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Dec 02 20:00:12 crc kubenswrapper[4807]: [+]process-running ok Dec 02 20:00:12 crc kubenswrapper[4807]: healthz check failed Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.188317 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2kvcm" podUID="c3b0b579-6db1-463a-b391-6799f284b89a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.205043 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7hz9f"] Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.206775 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.207873 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.221103 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.224852 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7hz9f"] Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.228027 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.322937 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-utilities\") pod \"redhat-operators-7hz9f\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.323408 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg7fj\" (UniqueName: \"kubernetes.io/projected/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-kube-api-access-rg7fj\") pod \"redhat-operators-7hz9f\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.323445 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-catalog-content\") pod \"redhat-operators-7hz9f\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.334801 4807 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.343452 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.345871 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-p49z8" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.345960 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.349187 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.349521 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.370211 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.433682 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-utilities\") pod \"redhat-operators-7hz9f\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.435042 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2991ea07-6f72-4a74-bbde-6f55eec399d5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2991ea07-6f72-4a74-bbde-6f55eec399d5\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.435073 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg7fj\" (UniqueName: \"kubernetes.io/projected/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-kube-api-access-rg7fj\") pod \"redhat-operators-7hz9f\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.435115 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2991ea07-6f72-4a74-bbde-6f55eec399d5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2991ea07-6f72-4a74-bbde-6f55eec399d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.435134 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-catalog-content\") pod \"redhat-operators-7hz9f\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.439428 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-utilities\") pod \"redhat-operators-7hz9f\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.440103 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-catalog-content\") pod \"redhat-operators-7hz9f\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " 
pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.470393 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jh4tx"] Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.500138 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg7fj\" (UniqueName: \"kubernetes.io/projected/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-kube-api-access-rg7fj\") pod \"redhat-operators-7hz9f\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.536892 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2991ea07-6f72-4a74-bbde-6f55eec399d5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2991ea07-6f72-4a74-bbde-6f55eec399d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.537701 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2991ea07-6f72-4a74-bbde-6f55eec399d5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2991ea07-6f72-4a74-bbde-6f55eec399d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.537792 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2991ea07-6f72-4a74-bbde-6f55eec399d5-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2991ea07-6f72-4a74-bbde-6f55eec399d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.577781 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/2991ea07-6f72-4a74-bbde-6f55eec399d5-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2991ea07-6f72-4a74-bbde-6f55eec399d5\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.668794 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.718292 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.727190 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9mxjq"] Dec 02 20:00:12 crc kubenswrapper[4807]: W1202 20:00:12.790822 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40b1a65e_4886_466b_81ca_387c4b36310a.slice/crio-147ef5c89da0f34da202ad96035e01ed08d4dd129fbd984cc174d24612deddb8 WatchSource:0}: Error finding container 147ef5c89da0f34da202ad96035e01ed08d4dd129fbd984cc174d24612deddb8: Status 404 returned error can't find the container with id 147ef5c89da0f34da202ad96035e01ed08d4dd129fbd984cc174d24612deddb8 Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.971128 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" event={"ID":"a8f3efef-a103-4813-94ed-1c9bd0113f84","Type":"ContainerStarted","Data":"bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c"} Dec 02 20:00:12 crc kubenswrapper[4807]: I1202 20:00:12.971198 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" event={"ID":"a8f3efef-a103-4813-94ed-1c9bd0113f84","Type":"ContainerStarted","Data":"e18d639997782f9f1aa5f01c8d86e3c2bbd76edc065476bf43ac3dbeb378502f"} Dec 02 
20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.010520 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.012735 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mxjq" event={"ID":"40b1a65e-4886-466b-81ca-387c4b36310a","Type":"ContainerStarted","Data":"147ef5c89da0f34da202ad96035e01ed08d4dd129fbd984cc174d24612deddb8"} Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.012822 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.019903 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" podStartSLOduration=128.019877588 podStartE2EDuration="2m8.019877588s" podCreationTimestamp="2025-12-02 19:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:13.018558787 +0000 UTC m=+148.319466282" watchObservedRunningTime="2025-12-02 20:00:13.019877588 +0000 UTC m=+148.320785083" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.036068 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" event={"ID":"bada0ce4-b0fb-4fc8-a30d-4756b0c0a0df","Type":"ContainerStarted","Data":"705c5cdefff7b71a7fa490987fcb85e5532f0f706a4f734385b717f2f662dd3d"} Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.057484 4807 generic.go:334] "Generic (PLEG): container finished" podID="3bc7f08d-6984-4bab-9220-761b68fdec0d" containerID="1d0bedb05fdcad8ad84c571471aa13395eb036822ff5a066e63600cd3240a83c" exitCode=0 Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.057584 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" event={"ID":"3bc7f08d-6984-4bab-9220-761b68fdec0d","Type":"ContainerDied","Data":"1d0bedb05fdcad8ad84c571471aa13395eb036822ff5a066e63600cd3240a83c"} Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.074232 4807 generic.go:334] "Generic (PLEG): container finished" podID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" containerID="a9419ca6b706d1cfeeb1b8fdb8356a500420d02e25f987c6bde2b605e23846d1" exitCode=0 Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.074444 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-6j8qw" podStartSLOduration=12.07442631 podStartE2EDuration="12.07442631s" podCreationTimestamp="2025-12-02 20:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:00:13.072095075 +0000 UTC m=+148.373002570" watchObservedRunningTime="2025-12-02 20:00:13.07442631 +0000 UTC m=+148.375333805" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.074618 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5smb" event={"ID":"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58","Type":"ContainerDied","Data":"a9419ca6b706d1cfeeb1b8fdb8356a500420d02e25f987c6bde2b605e23846d1"} Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.074741 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5smb" event={"ID":"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58","Type":"ContainerStarted","Data":"59f7ddd96a3472c0a126a7ba38ab97801836179cc2d4898c7c036b137ff2df0e"} Dec 02 20:00:13 crc kubenswrapper[4807]: W1202 20:00:13.121881 4807 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-b90fcbfa54b7fb1aa499d84d25dbb752b7d46a4c4fbce955a6024cb17df0b8f2 WatchSource:0}: Error finding container b90fcbfa54b7fb1aa499d84d25dbb752b7d46a4c4fbce955a6024cb17df0b8f2: Status 404 returned error can't find the container with id b90fcbfa54b7fb1aa499d84d25dbb752b7d46a4c4fbce955a6024cb17df0b8f2 Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.148389 4807 generic.go:334] "Generic (PLEG): container finished" podID="efbbb9d8-ccd1-40c9-a146-20dffe720203" containerID="facc3a7f6d40e75813592078a8cced23d3589b74324a3cd6b0fde668b63503eb" exitCode=0 Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.148577 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x9z86" event={"ID":"efbbb9d8-ccd1-40c9-a146-20dffe720203","Type":"ContainerDied","Data":"facc3a7f6d40e75813592078a8cced23d3589b74324a3cd6b0fde668b63503eb"} Dec 02 20:00:13 crc kubenswrapper[4807]: W1202 20:00:13.155357 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-fde70a84fb7a84dbcd39b425ab2e8daff809efe1eb61d060dcd7ab0804a7bf75 WatchSource:0}: Error finding container fde70a84fb7a84dbcd39b425ab2e8daff809efe1eb61d060dcd7ab0804a7bf75: Status 404 returned error can't find the container with id fde70a84fb7a84dbcd39b425ab2e8daff809efe1eb61d060dcd7ab0804a7bf75 Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.186710 4807 patch_prober.go:28] interesting pod/router-default-5444994796-2kvcm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 20:00:13 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Dec 02 20:00:13 crc kubenswrapper[4807]: [+]process-running ok Dec 02 20:00:13 
crc kubenswrapper[4807]: healthz check failed Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.186800 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2kvcm" podUID="c3b0b579-6db1-463a-b391-6799f284b89a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.434652 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.435083 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.450862 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.468059 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.468122 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.476471 4807 patch_prober.go:28] interesting pod/console-f9d7485db-bjmsc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.476525 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-bjmsc" podUID="d31ed2df-d4fa-4b71-a218-20d453f1d8cb" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" 
Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.573589 4807 patch_prober.go:28] interesting pod/downloads-7954f5f757-frmzm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.573666 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-frmzm" podUID="b5784f10-10dd-4363-a50a-60d37b9c9ec5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.581187 4807 patch_prober.go:28] interesting pod/downloads-7954f5f757-frmzm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.581253 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-frmzm" podUID="b5784f10-10dd-4363-a50a-60d37b9c9ec5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.597189 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7hz9f"] Dec 02 20:00:13 crc kubenswrapper[4807]: I1202 20:00:13.657155 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 20:00:13 crc kubenswrapper[4807]: W1202 20:00:13.681636 4807 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod2991ea07_6f72_4a74_bbde_6f55eec399d5.slice/crio-03adec855ce64e83332ae653896a30d2354919c96b6918e900e61afa7fb76629 WatchSource:0}: Error finding container 03adec855ce64e83332ae653896a30d2354919c96b6918e900e61afa7fb76629: Status 404 returned error can't find the container with id 03adec855ce64e83332ae653896a30d2354919c96b6918e900e61afa7fb76629 Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.168698 4807 generic.go:334] "Generic (PLEG): container finished" podID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" containerID="6343605f7e88645c656263b491054c069152317c128b3ba179edce8a6709f9cb" exitCode=0 Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.168771 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hz9f" event={"ID":"49f3d78d-194b-4b23-9bbc-cd89cd6fe402","Type":"ContainerDied","Data":"6343605f7e88645c656263b491054c069152317c128b3ba179edce8a6709f9cb"} Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.169097 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hz9f" event={"ID":"49f3d78d-194b-4b23-9bbc-cd89cd6fe402","Type":"ContainerStarted","Data":"4c03858aa1897863dfef8d7836b88db0d1e8ac8f3c9b75b93486db9df8e4cff4"} Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.173567 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2580f4c6afd63292c54aa928dd108ffbee5a3614ca5ef6eb26dded55e02f88a0"} Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.173601 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bb2d7ac1989e059466618e66e903a874591854c840d6fde65039e5896da217df"} Dec 02 20:00:14 crc 
kubenswrapper[4807]: I1202 20:00:14.175829 4807 generic.go:334] "Generic (PLEG): container finished" podID="40b1a65e-4886-466b-81ca-387c4b36310a" containerID="d56c820b0a0f8645d1976a3f89429956da3bcb636b45cb2ea26acd3a691bf1c3" exitCode=0 Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.175898 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mxjq" event={"ID":"40b1a65e-4886-466b-81ca-387c4b36310a","Type":"ContainerDied","Data":"d56c820b0a0f8645d1976a3f89429956da3bcb636b45cb2ea26acd3a691bf1c3"} Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.178400 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3de1f1da6e2eea95dfb8f4cc3861b589c4461391defb637c7eec27ed82ea07ba"} Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.178448 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b90fcbfa54b7fb1aa499d84d25dbb752b7d46a4c4fbce955a6024cb17df0b8f2"} Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.178675 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.178790 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.180108 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2991ea07-6f72-4a74-bbde-6f55eec399d5","Type":"ContainerStarted","Data":"03adec855ce64e83332ae653896a30d2354919c96b6918e900e61afa7fb76629"} Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.195528 
4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ac2877bb4586b6fc98ab82b915a9e933f41b5b5f91c9c11331eb118dfdeef0c4"} Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.195591 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fde70a84fb7a84dbcd39b425ab2e8daff809efe1eb61d060dcd7ab0804a7bf75"} Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.197740 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.204189 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6lrj2" Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.784315 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.911099 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bc7f08d-6984-4bab-9220-761b68fdec0d-secret-volume\") pod \"3bc7f08d-6984-4bab-9220-761b68fdec0d\" (UID: \"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.911200 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bc7f08d-6984-4bab-9220-761b68fdec0d-config-volume\") pod \"3bc7f08d-6984-4bab-9220-761b68fdec0d\" (UID: \"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.911269 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfvpb\" (UniqueName: \"kubernetes.io/projected/3bc7f08d-6984-4bab-9220-761b68fdec0d-kube-api-access-rfvpb\") pod \"3bc7f08d-6984-4bab-9220-761b68fdec0d\" (UID: \"3bc7f08d-6984-4bab-9220-761b68fdec0d\") " Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.912197 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc7f08d-6984-4bab-9220-761b68fdec0d-config-volume" (OuterVolumeSpecName: "config-volume") pod "3bc7f08d-6984-4bab-9220-761b68fdec0d" (UID: "3bc7f08d-6984-4bab-9220-761b68fdec0d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.918249 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bc7f08d-6984-4bab-9220-761b68fdec0d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3bc7f08d-6984-4bab-9220-761b68fdec0d" (UID: "3bc7f08d-6984-4bab-9220-761b68fdec0d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:00:14 crc kubenswrapper[4807]: I1202 20:00:14.930833 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc7f08d-6984-4bab-9220-761b68fdec0d-kube-api-access-rfvpb" (OuterVolumeSpecName: "kube-api-access-rfvpb") pod "3bc7f08d-6984-4bab-9220-761b68fdec0d" (UID: "3bc7f08d-6984-4bab-9220-761b68fdec0d"). InnerVolumeSpecName "kube-api-access-rfvpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:00:15 crc kubenswrapper[4807]: I1202 20:00:15.014451 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bc7f08d-6984-4bab-9220-761b68fdec0d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 20:00:15 crc kubenswrapper[4807]: I1202 20:00:15.015382 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bc7f08d-6984-4bab-9220-761b68fdec0d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 20:00:15 crc kubenswrapper[4807]: I1202 20:00:15.015395 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfvpb\" (UniqueName: \"kubernetes.io/projected/3bc7f08d-6984-4bab-9220-761b68fdec0d-kube-api-access-rfvpb\") on node \"crc\" DevicePath \"\"" Dec 02 20:00:15 crc kubenswrapper[4807]: I1202 20:00:15.275772 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" event={"ID":"3bc7f08d-6984-4bab-9220-761b68fdec0d","Type":"ContainerDied","Data":"4ac6d4667f3a27ba561f5f1687bfc97676bd0cc41edf2c332857c2aae9441326"} Dec 02 20:00:15 crc kubenswrapper[4807]: I1202 20:00:15.275842 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ac6d4667f3a27ba561f5f1687bfc97676bd0cc41edf2c332857c2aae9441326" Dec 02 20:00:15 crc kubenswrapper[4807]: I1202 20:00:15.275992 4807 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6" Dec 02 20:00:15 crc kubenswrapper[4807]: I1202 20:00:15.327850 4807 generic.go:334] "Generic (PLEG): container finished" podID="2991ea07-6f72-4a74-bbde-6f55eec399d5" containerID="9edfbc7f1f2aa6a1756dd94c203758b842b0c340d2ebdfdeb17773b34d34a91e" exitCode=0 Dec 02 20:00:15 crc kubenswrapper[4807]: I1202 20:00:15.328289 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2991ea07-6f72-4a74-bbde-6f55eec399d5","Type":"ContainerDied","Data":"9edfbc7f1f2aa6a1756dd94c203758b842b0c340d2ebdfdeb17773b34d34a91e"} Dec 02 20:00:15 crc kubenswrapper[4807]: I1202 20:00:15.334297 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-2kvcm" Dec 02 20:00:16 crc kubenswrapper[4807]: I1202 20:00:16.669371 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 20:00:16 crc kubenswrapper[4807]: I1202 20:00:16.780347 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2991ea07-6f72-4a74-bbde-6f55eec399d5-kubelet-dir\") pod \"2991ea07-6f72-4a74-bbde-6f55eec399d5\" (UID: \"2991ea07-6f72-4a74-bbde-6f55eec399d5\") " Dec 02 20:00:16 crc kubenswrapper[4807]: I1202 20:00:16.780408 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2991ea07-6f72-4a74-bbde-6f55eec399d5-kube-api-access\") pod \"2991ea07-6f72-4a74-bbde-6f55eec399d5\" (UID: \"2991ea07-6f72-4a74-bbde-6f55eec399d5\") " Dec 02 20:00:16 crc kubenswrapper[4807]: I1202 20:00:16.780496 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2991ea07-6f72-4a74-bbde-6f55eec399d5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2991ea07-6f72-4a74-bbde-6f55eec399d5" (UID: "2991ea07-6f72-4a74-bbde-6f55eec399d5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:00:16 crc kubenswrapper[4807]: I1202 20:00:16.780755 4807 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2991ea07-6f72-4a74-bbde-6f55eec399d5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 20:00:16 crc kubenswrapper[4807]: I1202 20:00:16.791580 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2991ea07-6f72-4a74-bbde-6f55eec399d5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2991ea07-6f72-4a74-bbde-6f55eec399d5" (UID: "2991ea07-6f72-4a74-bbde-6f55eec399d5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:00:16 crc kubenswrapper[4807]: I1202 20:00:16.882315 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2991ea07-6f72-4a74-bbde-6f55eec399d5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.378921 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2991ea07-6f72-4a74-bbde-6f55eec399d5","Type":"ContainerDied","Data":"03adec855ce64e83332ae653896a30d2354919c96b6918e900e61afa7fb76629"} Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.378983 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03adec855ce64e83332ae653896a30d2354919c96b6918e900e61afa7fb76629" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.379056 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.642246 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 20:00:17 crc kubenswrapper[4807]: E1202 20:00:17.642615 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc7f08d-6984-4bab-9220-761b68fdec0d" containerName="collect-profiles" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.642669 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc7f08d-6984-4bab-9220-761b68fdec0d" containerName="collect-profiles" Dec 02 20:00:17 crc kubenswrapper[4807]: E1202 20:00:17.642688 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2991ea07-6f72-4a74-bbde-6f55eec399d5" containerName="pruner" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.642695 4807 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2991ea07-6f72-4a74-bbde-6f55eec399d5" containerName="pruner" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.643794 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc7f08d-6984-4bab-9220-761b68fdec0d" containerName="collect-profiles" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.643813 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2991ea07-6f72-4a74-bbde-6f55eec399d5" containerName="pruner" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.644294 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.649209 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.653346 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.653609 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.799814 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.799891 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.901459 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.901552 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.901644 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.918059 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 20:00:17 crc kubenswrapper[4807]: I1202 20:00:17.973894 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 20:00:18 crc kubenswrapper[4807]: I1202 20:00:18.491089 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 20:00:18 crc kubenswrapper[4807]: W1202 20:00:18.518955 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podab3ef1d1_ba7f_4471_a0bc_fd0df06f2130.slice/crio-457c486eaa898d00f9ca31b6995663bdece4e6ad7e8a670491fc923270619443 WatchSource:0}: Error finding container 457c486eaa898d00f9ca31b6995663bdece4e6ad7e8a670491fc923270619443: Status 404 returned error can't find the container with id 457c486eaa898d00f9ca31b6995663bdece4e6ad7e8a670491fc923270619443 Dec 02 20:00:19 crc kubenswrapper[4807]: I1202 20:00:19.458060 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130","Type":"ContainerStarted","Data":"457c486eaa898d00f9ca31b6995663bdece4e6ad7e8a670491fc923270619443"} Dec 02 20:00:19 crc kubenswrapper[4807]: I1202 20:00:19.623600 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-b8g8z" Dec 02 20:00:21 crc kubenswrapper[4807]: I1202 20:00:21.478527 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130","Type":"ContainerStarted","Data":"29018a98d50967685d56e29f297f2013a549f2adf2a3a35ac55ca1701c36b35d"} Dec 02 20:00:22 crc kubenswrapper[4807]: I1202 20:00:22.505138 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=5.505114162 podStartE2EDuration="5.505114162s" podCreationTimestamp="2025-12-02 20:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-02 20:00:22.501871165 +0000 UTC m=+157.802778660" watchObservedRunningTime="2025-12-02 20:00:22.505114162 +0000 UTC m=+157.806021657" Dec 02 20:00:23 crc kubenswrapper[4807]: I1202 20:00:23.493699 4807 generic.go:334] "Generic (PLEG): container finished" podID="ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130" containerID="29018a98d50967685d56e29f297f2013a549f2adf2a3a35ac55ca1701c36b35d" exitCode=0 Dec 02 20:00:23 crc kubenswrapper[4807]: I1202 20:00:23.493765 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130","Type":"ContainerDied","Data":"29018a98d50967685d56e29f297f2013a549f2adf2a3a35ac55ca1701c36b35d"} Dec 02 20:00:23 crc kubenswrapper[4807]: I1202 20:00:23.577647 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-frmzm" Dec 02 20:00:23 crc kubenswrapper[4807]: I1202 20:00:23.930061 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:23 crc kubenswrapper[4807]: I1202 20:00:23.934463 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:00:27 crc kubenswrapper[4807]: I1202 20:00:27.170590 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs\") pod \"network-metrics-daemon-7z9t6\" (UID: \"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 20:00:27 crc kubenswrapper[4807]: I1202 20:00:27.179845 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1cb49a08-30b0-4353-ad4a-23362f281475-metrics-certs\") pod \"network-metrics-daemon-7z9t6\" (UID: 
\"1cb49a08-30b0-4353-ad4a-23362f281475\") " pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 20:00:27 crc kubenswrapper[4807]: I1202 20:00:27.200473 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7z9t6" Dec 02 20:00:28 crc kubenswrapper[4807]: I1202 20:00:28.292795 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:00:28 crc kubenswrapper[4807]: I1202 20:00:28.293410 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:00:32 crc kubenswrapper[4807]: I1202 20:00:32.073804 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:00:35 crc kubenswrapper[4807]: I1202 20:00:35.123455 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 20:00:35 crc kubenswrapper[4807]: I1202 20:00:35.194905 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kubelet-dir\") pod \"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130\" (UID: \"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130\") " Dec 02 20:00:35 crc kubenswrapper[4807]: I1202 20:00:35.194977 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kube-api-access\") pod \"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130\" (UID: \"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130\") " Dec 02 20:00:35 crc kubenswrapper[4807]: I1202 20:00:35.195082 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130" (UID: "ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:00:35 crc kubenswrapper[4807]: I1202 20:00:35.195371 4807 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 20:00:35 crc kubenswrapper[4807]: I1202 20:00:35.201528 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130" (UID: "ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:00:35 crc kubenswrapper[4807]: I1202 20:00:35.296705 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 20:00:35 crc kubenswrapper[4807]: I1202 20:00:35.574203 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130","Type":"ContainerDied","Data":"457c486eaa898d00f9ca31b6995663bdece4e6ad7e8a670491fc923270619443"} Dec 02 20:00:35 crc kubenswrapper[4807]: I1202 20:00:35.574251 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 20:00:35 crc kubenswrapper[4807]: I1202 20:00:35.574253 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="457c486eaa898d00f9ca31b6995663bdece4e6ad7e8a670491fc923270619443" Dec 02 20:00:43 crc kubenswrapper[4807]: I1202 20:00:43.540997 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cfdkz" Dec 02 20:00:53 crc kubenswrapper[4807]: I1202 20:00:53.528444 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.830461 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 20:00:54 crc kubenswrapper[4807]: E1202 20:00:54.831180 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130" containerName="pruner" Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.831205 4807 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130" containerName="pruner" Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.831376 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab3ef1d1-ba7f-4471-a0bc-fd0df06f2130" containerName="pruner" Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.832001 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.835011 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.835028 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.848282 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.888444 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a4a8446-37af-4473-9200-aafe3b5cefd9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6a4a8446-37af-4473-9200-aafe3b5cefd9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.888579 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a4a8446-37af-4473-9200-aafe3b5cefd9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6a4a8446-37af-4473-9200-aafe3b5cefd9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.990451 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6a4a8446-37af-4473-9200-aafe3b5cefd9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6a4a8446-37af-4473-9200-aafe3b5cefd9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.990574 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a4a8446-37af-4473-9200-aafe3b5cefd9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6a4a8446-37af-4473-9200-aafe3b5cefd9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 20:00:54 crc kubenswrapper[4807]: I1202 20:00:54.990599 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a4a8446-37af-4473-9200-aafe3b5cefd9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6a4a8446-37af-4473-9200-aafe3b5cefd9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 20:00:55 crc kubenswrapper[4807]: I1202 20:00:55.015883 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a4a8446-37af-4473-9200-aafe3b5cefd9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6a4a8446-37af-4473-9200-aafe3b5cefd9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 20:00:55 crc kubenswrapper[4807]: I1202 20:00:55.150917 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 20:00:58 crc kubenswrapper[4807]: I1202 20:00:58.293313 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:00:58 crc kubenswrapper[4807]: I1202 20:00:58.293683 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:00:59 crc kubenswrapper[4807]: I1202 20:00:59.832388 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 20:00:59 crc kubenswrapper[4807]: I1202 20:00:59.833163 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:00:59 crc kubenswrapper[4807]: I1202 20:00:59.838308 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 20:00:59 crc kubenswrapper[4807]: I1202 20:00:59.965422 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:00:59 crc kubenswrapper[4807]: I1202 20:00:59.965498 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-var-lock\") pod \"installer-9-crc\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:00:59 crc kubenswrapper[4807]: I1202 20:00:59.965546 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f888ee60-73ba-421a-9262-bc59ba72820f-kube-api-access\") pod \"installer-9-crc\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:01:00 crc kubenswrapper[4807]: I1202 20:01:00.066396 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:01:00 crc kubenswrapper[4807]: I1202 20:01:00.066477 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-var-lock\") pod \"installer-9-crc\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:01:00 crc kubenswrapper[4807]: I1202 20:01:00.066553 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f888ee60-73ba-421a-9262-bc59ba72820f-kube-api-access\") pod \"installer-9-crc\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:01:00 crc kubenswrapper[4807]: I1202 20:01:00.066565 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:01:00 crc kubenswrapper[4807]: I1202 20:01:00.066595 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-var-lock\") pod \"installer-9-crc\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:01:00 crc kubenswrapper[4807]: I1202 20:01:00.085190 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f888ee60-73ba-421a-9262-bc59ba72820f-kube-api-access\") pod \"installer-9-crc\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:01:00 crc kubenswrapper[4807]: I1202 20:01:00.167496 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:01:06 crc kubenswrapper[4807]: E1202 20:01:06.033747 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 20:01:06 crc kubenswrapper[4807]: E1202 20:01:06.034380 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jkbcd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-lmdbd_openshift-marketplace(21184a59-8520-4dd0-b459-a056b42e852d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:01:06 crc kubenswrapper[4807]: E1202 20:01:06.036052 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-lmdbd" podUID="21184a59-8520-4dd0-b459-a056b42e852d" Dec 02 20:01:07 crc kubenswrapper[4807]: E1202 20:01:07.348101 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 20:01:07 crc kubenswrapper[4807]: E1202 20:01:07.348689 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8vjr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-k5smb_openshift-marketplace(2f05f7b5-9f46-48c9-9b4f-a2497e87cc58): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:01:07 crc kubenswrapper[4807]: E1202 20:01:07.350034 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-k5smb" podUID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" Dec 02 20:01:09 crc 
kubenswrapper[4807]: E1202 20:01:09.828551 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-lmdbd" podUID="21184a59-8520-4dd0-b459-a056b42e852d" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.828640 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-k5smb" podUID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.926582 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.926751 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j4kpp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-x9z86_openshift-marketplace(efbbb9d8-ccd1-40c9-a146-20dffe720203): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.927665 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.927826 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-95trv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ncdvx_openshift-marketplace(3d555627-e360-46ad-b6d9-ec1cc3ce1d68): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.927902 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-x9z86" podUID="efbbb9d8-ccd1-40c9-a146-20dffe720203" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.929830 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ncdvx" podUID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.946999 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.947149 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mz2sr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hrpb5_openshift-marketplace(807e5ccc-6b8a-4be7-8b8c-34888237e22d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.948707 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hrpb5" podUID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" Dec 02 20:01:09 crc 
kubenswrapper[4807]: E1202 20:01:09.960746 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.960919 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wjk9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-dc74n_openshift-marketplace(095ad74d-f5f1-44f3-9007-c779f4f06f62): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:01:09 crc kubenswrapper[4807]: E1202 20:01:09.962116 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dc74n" podUID="095ad74d-f5f1-44f3-9007-c779f4f06f62" Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.120047 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dc74n" podUID="095ad74d-f5f1-44f3-9007-c779f4f06f62" Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.120047 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ncdvx" podUID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.120102 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-x9z86" podUID="efbbb9d8-ccd1-40c9-a146-20dffe720203" Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.120201 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" 
with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hrpb5" podUID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.244859 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.245318 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rg7fj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFro
m:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7hz9f_openshift-marketplace(49f3d78d-194b-4b23-9bbc-cd89cd6fe402): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.246973 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7hz9f" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.332435 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.332618 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dhtv5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-9mxjq_openshift-marketplace(40b1a65e-4886-466b-81ca-387c4b36310a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.334680 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-9mxjq" podUID="40b1a65e-4886-466b-81ca-387c4b36310a" Dec 02 20:01:13 crc 
kubenswrapper[4807]: I1202 20:01:13.561894 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 20:01:13 crc kubenswrapper[4807]: I1202 20:01:13.624315 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 20:01:13 crc kubenswrapper[4807]: I1202 20:01:13.647561 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7z9t6"] Dec 02 20:01:13 crc kubenswrapper[4807]: I1202 20:01:13.784755 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" event={"ID":"1cb49a08-30b0-4353-ad4a-23362f281475","Type":"ContainerStarted","Data":"f8ade52d35fc5ed08f54723fd77f7fb9cd834904c651dc46d82b3d7a14d17359"} Dec 02 20:01:13 crc kubenswrapper[4807]: I1202 20:01:13.787145 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f888ee60-73ba-421a-9262-bc59ba72820f","Type":"ContainerStarted","Data":"b8f16bfcc5228715a4951b4bc676056f8727edc89f1c5a9917924e3cecb11365"} Dec 02 20:01:13 crc kubenswrapper[4807]: I1202 20:01:13.790439 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6a4a8446-37af-4473-9200-aafe3b5cefd9","Type":"ContainerStarted","Data":"6e5b1347268c828739a50c5c829528b198bedc5e9b0617119563af34549a4e62"} Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.791239 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7hz9f" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" Dec 02 20:01:13 crc kubenswrapper[4807]: E1202 20:01:13.795926 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-9mxjq" podUID="40b1a65e-4886-466b-81ca-387c4b36310a" Dec 02 20:01:14 crc kubenswrapper[4807]: I1202 20:01:14.797872 4807 generic.go:334] "Generic (PLEG): container finished" podID="6a4a8446-37af-4473-9200-aafe3b5cefd9" containerID="4e6925df9bce1b30e72f756a7e7dacad84b0ef30d2b404d8d77290dc1ee02596" exitCode=0 Dec 02 20:01:14 crc kubenswrapper[4807]: I1202 20:01:14.797983 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6a4a8446-37af-4473-9200-aafe3b5cefd9","Type":"ContainerDied","Data":"4e6925df9bce1b30e72f756a7e7dacad84b0ef30d2b404d8d77290dc1ee02596"} Dec 02 20:01:14 crc kubenswrapper[4807]: I1202 20:01:14.800582 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" event={"ID":"1cb49a08-30b0-4353-ad4a-23362f281475","Type":"ContainerStarted","Data":"38e4d156cbe62e04f89db247e193322f1be04f0170bbc41d0a478c1bb04d1ae6"} Dec 02 20:01:14 crc kubenswrapper[4807]: I1202 20:01:14.800637 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7z9t6" event={"ID":"1cb49a08-30b0-4353-ad4a-23362f281475","Type":"ContainerStarted","Data":"89bfa0b90fc256bdb9bbe4ed76ca3b706001343ad51355a6d16467a23d9e56c5"} Dec 02 20:01:14 crc kubenswrapper[4807]: I1202 20:01:14.802748 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f888ee60-73ba-421a-9262-bc59ba72820f","Type":"ContainerStarted","Data":"91ccc63f354fb4fc6d8f6167aacc7757404aea881055cfa2d5de2fc4a6325e13"} Dec 02 20:01:14 crc kubenswrapper[4807]: I1202 20:01:14.846673 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7z9t6" podStartSLOduration=190.846648095 
podStartE2EDuration="3m10.846648095s" podCreationTimestamp="2025-12-02 19:58:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:01:14.84178508 +0000 UTC m=+210.142692575" watchObservedRunningTime="2025-12-02 20:01:14.846648095 +0000 UTC m=+210.147555590" Dec 02 20:01:14 crc kubenswrapper[4807]: I1202 20:01:14.864882 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=15.86485015 podStartE2EDuration="15.86485015s" podCreationTimestamp="2025-12-02 20:00:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:01:14.862216616 +0000 UTC m=+210.163124131" watchObservedRunningTime="2025-12-02 20:01:14.86485015 +0000 UTC m=+210.165757645" Dec 02 20:01:16 crc kubenswrapper[4807]: I1202 20:01:16.118192 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 20:01:16 crc kubenswrapper[4807]: I1202 20:01:16.224616 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a4a8446-37af-4473-9200-aafe3b5cefd9-kubelet-dir\") pod \"6a4a8446-37af-4473-9200-aafe3b5cefd9\" (UID: \"6a4a8446-37af-4473-9200-aafe3b5cefd9\") " Dec 02 20:01:16 crc kubenswrapper[4807]: I1202 20:01:16.224768 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a4a8446-37af-4473-9200-aafe3b5cefd9-kube-api-access\") pod \"6a4a8446-37af-4473-9200-aafe3b5cefd9\" (UID: \"6a4a8446-37af-4473-9200-aafe3b5cefd9\") " Dec 02 20:01:16 crc kubenswrapper[4807]: I1202 20:01:16.224899 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a4a8446-37af-4473-9200-aafe3b5cefd9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6a4a8446-37af-4473-9200-aafe3b5cefd9" (UID: "6a4a8446-37af-4473-9200-aafe3b5cefd9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:01:16 crc kubenswrapper[4807]: I1202 20:01:16.225150 4807 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a4a8446-37af-4473-9200-aafe3b5cefd9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:16 crc kubenswrapper[4807]: I1202 20:01:16.231167 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a4a8446-37af-4473-9200-aafe3b5cefd9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6a4a8446-37af-4473-9200-aafe3b5cefd9" (UID: "6a4a8446-37af-4473-9200-aafe3b5cefd9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:01:16 crc kubenswrapper[4807]: I1202 20:01:16.326737 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a4a8446-37af-4473-9200-aafe3b5cefd9-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:16 crc kubenswrapper[4807]: I1202 20:01:16.818058 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6a4a8446-37af-4473-9200-aafe3b5cefd9","Type":"ContainerDied","Data":"6e5b1347268c828739a50c5c829528b198bedc5e9b0617119563af34549a4e62"} Dec 02 20:01:16 crc kubenswrapper[4807]: I1202 20:01:16.818422 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e5b1347268c828739a50c5c829528b198bedc5e9b0617119563af34549a4e62" Dec 02 20:01:16 crc kubenswrapper[4807]: I1202 20:01:16.818189 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 20:01:23 crc kubenswrapper[4807]: I1202 20:01:23.966054 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v4lp2"] Dec 02 20:01:24 crc kubenswrapper[4807]: I1202 20:01:24.872839 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5smb" event={"ID":"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58","Type":"ContainerStarted","Data":"b29a477f800e87ca889e463d588bf066e1f90268995a2bb41a6dc4f93f3fbdc7"} Dec 02 20:01:24 crc kubenswrapper[4807]: I1202 20:01:24.879812 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmdbd" event={"ID":"21184a59-8520-4dd0-b459-a056b42e852d","Type":"ContainerStarted","Data":"fea6d7a742e85bb055296aae03bed7991cd4c0562259a20f906f4717857a339d"} Dec 02 20:01:25 crc kubenswrapper[4807]: I1202 20:01:25.890356 4807 generic.go:334] "Generic (PLEG): 
container finished" podID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" containerID="b29a477f800e87ca889e463d588bf066e1f90268995a2bb41a6dc4f93f3fbdc7" exitCode=0 Dec 02 20:01:25 crc kubenswrapper[4807]: I1202 20:01:25.890430 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5smb" event={"ID":"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58","Type":"ContainerDied","Data":"b29a477f800e87ca889e463d588bf066e1f90268995a2bb41a6dc4f93f3fbdc7"} Dec 02 20:01:25 crc kubenswrapper[4807]: I1202 20:01:25.898937 4807 generic.go:334] "Generic (PLEG): container finished" podID="efbbb9d8-ccd1-40c9-a146-20dffe720203" containerID="fd1c6b4f67d95230f03d9d1ffc859379359a05cc5213e20d1bba599e9a6a20df" exitCode=0 Dec 02 20:01:25 crc kubenswrapper[4807]: I1202 20:01:25.899027 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x9z86" event={"ID":"efbbb9d8-ccd1-40c9-a146-20dffe720203","Type":"ContainerDied","Data":"fd1c6b4f67d95230f03d9d1ffc859379359a05cc5213e20d1bba599e9a6a20df"} Dec 02 20:01:25 crc kubenswrapper[4807]: I1202 20:01:25.909293 4807 generic.go:334] "Generic (PLEG): container finished" podID="21184a59-8520-4dd0-b459-a056b42e852d" containerID="fea6d7a742e85bb055296aae03bed7991cd4c0562259a20f906f4717857a339d" exitCode=0 Dec 02 20:01:25 crc kubenswrapper[4807]: I1202 20:01:25.909423 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmdbd" event={"ID":"21184a59-8520-4dd0-b459-a056b42e852d","Type":"ContainerDied","Data":"fea6d7a742e85bb055296aae03bed7991cd4c0562259a20f906f4717857a339d"} Dec 02 20:01:25 crc kubenswrapper[4807]: I1202 20:01:25.935042 4807 generic.go:334] "Generic (PLEG): container finished" podID="095ad74d-f5f1-44f3-9007-c779f4f06f62" containerID="1d2531aee2690b81e86eba992d1c64080cc10e58392d8f7821438fe36251a555" exitCode=0 Dec 02 20:01:25 crc kubenswrapper[4807]: I1202 20:01:25.935090 4807 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-dc74n" event={"ID":"095ad74d-f5f1-44f3-9007-c779f4f06f62","Type":"ContainerDied","Data":"1d2531aee2690b81e86eba992d1c64080cc10e58392d8f7821438fe36251a555"} Dec 02 20:01:26 crc kubenswrapper[4807]: I1202 20:01:26.941374 4807 generic.go:334] "Generic (PLEG): container finished" podID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" containerID="bbc2502ac4675046fa89b9b5f7f0bb6f9b0acf5b217ee938af6b833ae6778b51" exitCode=0 Dec 02 20:01:26 crc kubenswrapper[4807]: I1202 20:01:26.941561 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncdvx" event={"ID":"3d555627-e360-46ad-b6d9-ec1cc3ce1d68","Type":"ContainerDied","Data":"bbc2502ac4675046fa89b9b5f7f0bb6f9b0acf5b217ee938af6b833ae6778b51"} Dec 02 20:01:26 crc kubenswrapper[4807]: I1202 20:01:26.946228 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x9z86" event={"ID":"efbbb9d8-ccd1-40c9-a146-20dffe720203","Type":"ContainerStarted","Data":"65b8ff673a10a84f3060642c0d188b55ed72bc70ba45aa95b5e846cf784adf7e"} Dec 02 20:01:26 crc kubenswrapper[4807]: I1202 20:01:26.948459 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmdbd" event={"ID":"21184a59-8520-4dd0-b459-a056b42e852d","Type":"ContainerStarted","Data":"a92f4249ed10df51d6c96cd65f20b67547c3571bc4a738ba70cf0a4f0f0434f2"} Dec 02 20:01:26 crc kubenswrapper[4807]: I1202 20:01:26.950597 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc74n" event={"ID":"095ad74d-f5f1-44f3-9007-c779f4f06f62","Type":"ContainerStarted","Data":"47190befdc266a72d78ba4833808b6c46634a39515f1e84eba94015c02d8b96f"} Dec 02 20:01:26 crc kubenswrapper[4807]: I1202 20:01:26.952950 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5smb" 
event={"ID":"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58","Type":"ContainerStarted","Data":"c86a477c8cb29920e138db1ff3166f0486b552f6dcf60f8ca9d800c0935d4de7"} Dec 02 20:01:26 crc kubenswrapper[4807]: I1202 20:01:26.990031 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dc74n" podStartSLOduration=3.409924477 podStartE2EDuration="1m18.990016109s" podCreationTimestamp="2025-12-02 20:00:08 +0000 UTC" firstStartedPulling="2025-12-02 20:00:10.849984586 +0000 UTC m=+146.150892081" lastFinishedPulling="2025-12-02 20:01:26.430076218 +0000 UTC m=+221.730983713" observedRunningTime="2025-12-02 20:01:26.984625785 +0000 UTC m=+222.285533290" watchObservedRunningTime="2025-12-02 20:01:26.990016109 +0000 UTC m=+222.290923604" Dec 02 20:01:27 crc kubenswrapper[4807]: I1202 20:01:27.005831 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x9z86" podStartSLOduration=3.637640335 podStartE2EDuration="1m17.005815918s" podCreationTimestamp="2025-12-02 20:00:10 +0000 UTC" firstStartedPulling="2025-12-02 20:00:13.177977713 +0000 UTC m=+148.478885208" lastFinishedPulling="2025-12-02 20:01:26.546153296 +0000 UTC m=+221.847060791" observedRunningTime="2025-12-02 20:01:27.004704698 +0000 UTC m=+222.305612203" watchObservedRunningTime="2025-12-02 20:01:27.005815918 +0000 UTC m=+222.306723403" Dec 02 20:01:27 crc kubenswrapper[4807]: I1202 20:01:27.050779 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lmdbd" podStartSLOduration=3.5413489289999998 podStartE2EDuration="1m19.050757225s" podCreationTimestamp="2025-12-02 20:00:08 +0000 UTC" firstStartedPulling="2025-12-02 20:00:10.963471703 +0000 UTC m=+146.264379198" lastFinishedPulling="2025-12-02 20:01:26.472879999 +0000 UTC m=+221.773787494" observedRunningTime="2025-12-02 20:01:27.049762089 +0000 UTC m=+222.350669594" 
watchObservedRunningTime="2025-12-02 20:01:27.050757225 +0000 UTC m=+222.351664720" Dec 02 20:01:27 crc kubenswrapper[4807]: I1202 20:01:27.051369 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k5smb" podStartSLOduration=2.859091341 podStartE2EDuration="1m16.051362237s" podCreationTimestamp="2025-12-02 20:00:11 +0000 UTC" firstStartedPulling="2025-12-02 20:00:13.147013215 +0000 UTC m=+148.447920720" lastFinishedPulling="2025-12-02 20:01:26.339284121 +0000 UTC m=+221.640191616" observedRunningTime="2025-12-02 20:01:27.02617175 +0000 UTC m=+222.327079245" watchObservedRunningTime="2025-12-02 20:01:27.051362237 +0000 UTC m=+222.352269732" Dec 02 20:01:27 crc kubenswrapper[4807]: I1202 20:01:27.961774 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncdvx" event={"ID":"3d555627-e360-46ad-b6d9-ec1cc3ce1d68","Type":"ContainerStarted","Data":"f2c1c5cc22244c0809370cfa96ec52ccaf2f98306881848624f37b4f661683a2"} Dec 02 20:01:28 crc kubenswrapper[4807]: I1202 20:01:28.293407 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:01:28 crc kubenswrapper[4807]: I1202 20:01:28.293494 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:01:28 crc kubenswrapper[4807]: I1202 20:01:28.293555 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 
02 20:01:28 crc kubenswrapper[4807]: I1202 20:01:28.294259 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:01:28 crc kubenswrapper[4807]: I1202 20:01:28.294401 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079" gracePeriod=600 Dec 02 20:01:28 crc kubenswrapper[4807]: I1202 20:01:28.972077 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079" exitCode=0 Dec 02 20:01:28 crc kubenswrapper[4807]: I1202 20:01:28.979425 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079"} Dec 02 20:01:28 crc kubenswrapper[4807]: I1202 20:01:28.994986 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ncdvx" podStartSLOduration=4.514687644 podStartE2EDuration="1m19.994967932s" podCreationTimestamp="2025-12-02 20:00:09 +0000 UTC" firstStartedPulling="2025-12-02 20:00:11.916848942 +0000 UTC m=+147.217756427" lastFinishedPulling="2025-12-02 20:01:27.39712922 +0000 UTC m=+222.698036715" observedRunningTime="2025-12-02 20:01:28.994609719 +0000 UTC m=+224.295517214" watchObservedRunningTime="2025-12-02 
20:01:28.994967932 +0000 UTC m=+224.295875427" Dec 02 20:01:29 crc kubenswrapper[4807]: I1202 20:01:29.095153 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:01:29 crc kubenswrapper[4807]: I1202 20:01:29.095203 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:01:29 crc kubenswrapper[4807]: I1202 20:01:29.186249 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:01:29 crc kubenswrapper[4807]: I1202 20:01:29.186296 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:01:29 crc kubenswrapper[4807]: I1202 20:01:29.304478 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:01:29 crc kubenswrapper[4807]: I1202 20:01:29.305907 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:01:29 crc kubenswrapper[4807]: I1202 20:01:29.555233 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:01:29 crc kubenswrapper[4807]: I1202 20:01:29.555611 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:01:29 crc kubenswrapper[4807]: I1202 20:01:29.978756 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"921e128bf7c751c48abace6e4b39de2d8b371ebc3478e06c957ad3086674020b"} Dec 02 20:01:29 crc kubenswrapper[4807]: I1202 20:01:29.981043 4807 generic.go:334] "Generic 
(PLEG): container finished" podID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" containerID="717fe89029fe7c4a025c42dd1001669dbc241143a589297bd283af161cd090a6" exitCode=0 Dec 02 20:01:29 crc kubenswrapper[4807]: I1202 20:01:29.981158 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpb5" event={"ID":"807e5ccc-6b8a-4be7-8b8c-34888237e22d","Type":"ContainerDied","Data":"717fe89029fe7c4a025c42dd1001669dbc241143a589297bd283af161cd090a6"} Dec 02 20:01:30 crc kubenswrapper[4807]: I1202 20:01:30.177317 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:01:30 crc kubenswrapper[4807]: I1202 20:01:30.987558 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mxjq" event={"ID":"40b1a65e-4886-466b-81ca-387c4b36310a","Type":"ContainerStarted","Data":"b2f9b345ea1b984bcaf053e4f9d206fc1a16460613e5d8b24745a5a58c2af6a9"} Dec 02 20:01:30 crc kubenswrapper[4807]: I1202 20:01:30.991442 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpb5" event={"ID":"807e5ccc-6b8a-4be7-8b8c-34888237e22d","Type":"ContainerStarted","Data":"fc27b3f558a0c9318544f2eb21da71d9c491bd96f7a03ca3ecea260cb61cbacc"} Dec 02 20:01:30 crc kubenswrapper[4807]: I1202 20:01:30.993617 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hz9f" event={"ID":"49f3d78d-194b-4b23-9bbc-cd89cd6fe402","Type":"ContainerStarted","Data":"554165af4632877186f0983b67719a83ac22252033adda26f83d03dd632c0bc5"} Dec 02 20:01:31 crc kubenswrapper[4807]: I1202 20:01:31.050022 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hrpb5" podStartSLOduration=4.227908012 podStartE2EDuration="1m23.049996016s" podCreationTimestamp="2025-12-02 20:00:08 +0000 UTC" firstStartedPulling="2025-12-02 
20:00:11.929218563 +0000 UTC m=+147.230126048" lastFinishedPulling="2025-12-02 20:01:30.751306557 +0000 UTC m=+226.052214052" observedRunningTime="2025-12-02 20:01:31.047080141 +0000 UTC m=+226.347987646" watchObservedRunningTime="2025-12-02 20:01:31.049996016 +0000 UTC m=+226.350903511" Dec 02 20:01:31 crc kubenswrapper[4807]: I1202 20:01:31.172151 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:01:31 crc kubenswrapper[4807]: I1202 20:01:31.176793 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:01:31 crc kubenswrapper[4807]: I1202 20:01:31.241707 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:01:31 crc kubenswrapper[4807]: I1202 20:01:31.512831 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:01:31 crc kubenswrapper[4807]: I1202 20:01:31.517684 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:01:31 crc kubenswrapper[4807]: I1202 20:01:31.590066 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:01:32 crc kubenswrapper[4807]: I1202 20:01:32.077402 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:01:32 crc kubenswrapper[4807]: I1202 20:01:32.119108 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:01:33 crc kubenswrapper[4807]: I1202 20:01:33.012960 4807 generic.go:334] "Generic (PLEG): container finished" podID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" 
containerID="554165af4632877186f0983b67719a83ac22252033adda26f83d03dd632c0bc5" exitCode=0 Dec 02 20:01:33 crc kubenswrapper[4807]: I1202 20:01:33.013080 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hz9f" event={"ID":"49f3d78d-194b-4b23-9bbc-cd89cd6fe402","Type":"ContainerDied","Data":"554165af4632877186f0983b67719a83ac22252033adda26f83d03dd632c0bc5"} Dec 02 20:01:33 crc kubenswrapper[4807]: I1202 20:01:33.015297 4807 generic.go:334] "Generic (PLEG): container finished" podID="40b1a65e-4886-466b-81ca-387c4b36310a" containerID="b2f9b345ea1b984bcaf053e4f9d206fc1a16460613e5d8b24745a5a58c2af6a9" exitCode=0 Dec 02 20:01:33 crc kubenswrapper[4807]: I1202 20:01:33.015363 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mxjq" event={"ID":"40b1a65e-4886-466b-81ca-387c4b36310a","Type":"ContainerDied","Data":"b2f9b345ea1b984bcaf053e4f9d206fc1a16460613e5d8b24745a5a58c2af6a9"} Dec 02 20:01:36 crc kubenswrapper[4807]: I1202 20:01:36.041246 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mxjq" event={"ID":"40b1a65e-4886-466b-81ca-387c4b36310a","Type":"ContainerStarted","Data":"2669c4507c7283518d0553b314b82ae4b4683466028c8d6e73a2565afa58429f"} Dec 02 20:01:36 crc kubenswrapper[4807]: I1202 20:01:36.046335 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hz9f" event={"ID":"49f3d78d-194b-4b23-9bbc-cd89cd6fe402","Type":"ContainerStarted","Data":"297267db2def15a8785d534b49caa2b18ca338f6a0810ab8696a684049b528ac"} Dec 02 20:01:36 crc kubenswrapper[4807]: I1202 20:01:36.058936 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9mxjq" podStartSLOduration=4.538205789 podStartE2EDuration="1m25.058917003s" podCreationTimestamp="2025-12-02 20:00:11 +0000 UTC" firstStartedPulling="2025-12-02 20:00:14.177866944 +0000 UTC 
m=+149.478774439" lastFinishedPulling="2025-12-02 20:01:34.698578158 +0000 UTC m=+229.999485653" observedRunningTime="2025-12-02 20:01:36.058517559 +0000 UTC m=+231.359425084" watchObservedRunningTime="2025-12-02 20:01:36.058917003 +0000 UTC m=+231.359824498" Dec 02 20:01:36 crc kubenswrapper[4807]: I1202 20:01:36.080536 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7hz9f" podStartSLOduration=2.731869384 podStartE2EDuration="1m24.08051078s" podCreationTimestamp="2025-12-02 20:00:12 +0000 UTC" firstStartedPulling="2025-12-02 20:00:14.175288514 +0000 UTC m=+149.476196019" lastFinishedPulling="2025-12-02 20:01:35.52392992 +0000 UTC m=+230.824837415" observedRunningTime="2025-12-02 20:01:36.079579857 +0000 UTC m=+231.380487362" watchObservedRunningTime="2025-12-02 20:01:36.08051078 +0000 UTC m=+231.381418275" Dec 02 20:01:36 crc kubenswrapper[4807]: I1202 20:01:36.235397 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5smb"] Dec 02 20:01:36 crc kubenswrapper[4807]: I1202 20:01:36.235614 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k5smb" podUID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" containerName="registry-server" containerID="cri-o://c86a477c8cb29920e138db1ff3166f0486b552f6dcf60f8ca9d800c0935d4de7" gracePeriod=2 Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.072759 4807 generic.go:334] "Generic (PLEG): container finished" podID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" containerID="c86a477c8cb29920e138db1ff3166f0486b552f6dcf60f8ca9d800c0935d4de7" exitCode=0 Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.072852 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5smb" 
event={"ID":"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58","Type":"ContainerDied","Data":"c86a477c8cb29920e138db1ff3166f0486b552f6dcf60f8ca9d800c0935d4de7"} Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.139706 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.225040 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.349487 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.349896 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.389667 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.592471 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.653412 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.788043 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-utilities\") pod \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.788147 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-catalog-content\") pod \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.788313 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8vjr\" (UniqueName: \"kubernetes.io/projected/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-kube-api-access-r8vjr\") pod \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\" (UID: \"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58\") " Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.789120 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-utilities" (OuterVolumeSpecName: "utilities") pod "2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" (UID: "2f05f7b5-9f46-48c9-9b4f-a2497e87cc58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.800128 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-kube-api-access-r8vjr" (OuterVolumeSpecName: "kube-api-access-r8vjr") pod "2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" (UID: "2f05f7b5-9f46-48c9-9b4f-a2497e87cc58"). InnerVolumeSpecName "kube-api-access-r8vjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.810834 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" (UID: "2f05f7b5-9f46-48c9-9b4f-a2497e87cc58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.890279 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8vjr\" (UniqueName: \"kubernetes.io/projected/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-kube-api-access-r8vjr\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.890337 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:39 crc kubenswrapper[4807]: I1202 20:01:39.890351 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:40 crc kubenswrapper[4807]: I1202 20:01:40.080122 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k5smb" event={"ID":"2f05f7b5-9f46-48c9-9b4f-a2497e87cc58","Type":"ContainerDied","Data":"59f7ddd96a3472c0a126a7ba38ab97801836179cc2d4898c7c036b137ff2df0e"} Dec 02 20:01:40 crc kubenswrapper[4807]: I1202 20:01:40.080188 4807 scope.go:117] "RemoveContainer" containerID="c86a477c8cb29920e138db1ff3166f0486b552f6dcf60f8ca9d800c0935d4de7" Dec 02 20:01:40 crc kubenswrapper[4807]: I1202 20:01:40.080207 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k5smb" Dec 02 20:01:40 crc kubenswrapper[4807]: I1202 20:01:40.098324 4807 scope.go:117] "RemoveContainer" containerID="b29a477f800e87ca889e463d588bf066e1f90268995a2bb41a6dc4f93f3fbdc7" Dec 02 20:01:40 crc kubenswrapper[4807]: I1202 20:01:40.107916 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5smb"] Dec 02 20:01:40 crc kubenswrapper[4807]: I1202 20:01:40.113674 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k5smb"] Dec 02 20:01:40 crc kubenswrapper[4807]: I1202 20:01:40.127534 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:01:40 crc kubenswrapper[4807]: I1202 20:01:40.134988 4807 scope.go:117] "RemoveContainer" containerID="a9419ca6b706d1cfeeb1b8fdb8356a500420d02e25f987c6bde2b605e23846d1" Dec 02 20:01:40 crc kubenswrapper[4807]: I1202 20:01:40.978492 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" path="/var/lib/kubelet/pods/2f05f7b5-9f46-48c9-9b4f-a2497e87cc58/volumes" Dec 02 20:01:41 crc kubenswrapper[4807]: I1202 20:01:41.435793 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hrpb5"] Dec 02 20:01:42 crc kubenswrapper[4807]: I1202 20:01:42.035371 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ncdvx"] Dec 02 20:01:42 crc kubenswrapper[4807]: I1202 20:01:42.035766 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ncdvx" podUID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" containerName="registry-server" containerID="cri-o://f2c1c5cc22244c0809370cfa96ec52ccaf2f98306881848624f37b4f661683a2" gracePeriod=2 Dec 02 20:01:42 crc kubenswrapper[4807]: I1202 
20:01:42.123950 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:01:42 crc kubenswrapper[4807]: I1202 20:01:42.124033 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:01:42 crc kubenswrapper[4807]: I1202 20:01:42.184142 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:01:42 crc kubenswrapper[4807]: I1202 20:01:42.670287 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:01:42 crc kubenswrapper[4807]: I1202 20:01:42.670364 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:01:42 crc kubenswrapper[4807]: I1202 20:01:42.717346 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:01:43 crc kubenswrapper[4807]: I1202 20:01:43.098222 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hrpb5" podUID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" containerName="registry-server" containerID="cri-o://fc27b3f558a0c9318544f2eb21da71d9c491bd96f7a03ca3ecea260cb61cbacc" gracePeriod=2 Dec 02 20:01:43 crc kubenswrapper[4807]: I1202 20:01:43.142286 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:01:43 crc kubenswrapper[4807]: I1202 20:01:43.150943 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:01:44 crc kubenswrapper[4807]: I1202 20:01:44.108387 4807 generic.go:334] "Generic (PLEG): container finished" podID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" 
containerID="fc27b3f558a0c9318544f2eb21da71d9c491bd96f7a03ca3ecea260cb61cbacc" exitCode=0 Dec 02 20:01:44 crc kubenswrapper[4807]: I1202 20:01:44.108458 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpb5" event={"ID":"807e5ccc-6b8a-4be7-8b8c-34888237e22d","Type":"ContainerDied","Data":"fc27b3f558a0c9318544f2eb21da71d9c491bd96f7a03ca3ecea260cb61cbacc"} Dec 02 20:01:44 crc kubenswrapper[4807]: I1202 20:01:44.112151 4807 generic.go:334] "Generic (PLEG): container finished" podID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" containerID="f2c1c5cc22244c0809370cfa96ec52ccaf2f98306881848624f37b4f661683a2" exitCode=0 Dec 02 20:01:44 crc kubenswrapper[4807]: I1202 20:01:44.112243 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncdvx" event={"ID":"3d555627-e360-46ad-b6d9-ec1cc3ce1d68","Type":"ContainerDied","Data":"f2c1c5cc22244c0809370cfa96ec52ccaf2f98306881848624f37b4f661683a2"} Dec 02 20:01:46 crc kubenswrapper[4807]: I1202 20:01:46.842285 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7hz9f"] Dec 02 20:01:46 crc kubenswrapper[4807]: I1202 20:01:46.843002 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7hz9f" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" containerName="registry-server" containerID="cri-o://297267db2def15a8785d534b49caa2b18ca338f6a0810ab8696a684049b528ac" gracePeriod=2 Dec 02 20:01:46 crc kubenswrapper[4807]: I1202 20:01:46.887028 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:01:46 crc kubenswrapper[4807]: I1202 20:01:46.992256 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz2sr\" (UniqueName: \"kubernetes.io/projected/807e5ccc-6b8a-4be7-8b8c-34888237e22d-kube-api-access-mz2sr\") pod \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " Dec 02 20:01:46 crc kubenswrapper[4807]: I1202 20:01:46.992321 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-catalog-content\") pod \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " Dec 02 20:01:46 crc kubenswrapper[4807]: I1202 20:01:46.992342 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-utilities\") pod \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\" (UID: \"807e5ccc-6b8a-4be7-8b8c-34888237e22d\") " Dec 02 20:01:46 crc kubenswrapper[4807]: I1202 20:01:46.993813 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-utilities" (OuterVolumeSpecName: "utilities") pod "807e5ccc-6b8a-4be7-8b8c-34888237e22d" (UID: "807e5ccc-6b8a-4be7-8b8c-34888237e22d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:01:46 crc kubenswrapper[4807]: I1202 20:01:46.997598 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807e5ccc-6b8a-4be7-8b8c-34888237e22d-kube-api-access-mz2sr" (OuterVolumeSpecName: "kube-api-access-mz2sr") pod "807e5ccc-6b8a-4be7-8b8c-34888237e22d" (UID: "807e5ccc-6b8a-4be7-8b8c-34888237e22d"). InnerVolumeSpecName "kube-api-access-mz2sr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.059610 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.089213 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "807e5ccc-6b8a-4be7-8b8c-34888237e22d" (UID: "807e5ccc-6b8a-4be7-8b8c-34888237e22d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.094020 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz2sr\" (UniqueName: \"kubernetes.io/projected/807e5ccc-6b8a-4be7-8b8c-34888237e22d-kube-api-access-mz2sr\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.094084 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.094103 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/807e5ccc-6b8a-4be7-8b8c-34888237e22d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.137400 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hrpb5" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.137421 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hrpb5" event={"ID":"807e5ccc-6b8a-4be7-8b8c-34888237e22d","Type":"ContainerDied","Data":"676e6340d36ad1769d6d795830ec4b0950c123179c0d032568241a2e167f02ad"} Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.137508 4807 scope.go:117] "RemoveContainer" containerID="fc27b3f558a0c9318544f2eb21da71d9c491bd96f7a03ca3ecea260cb61cbacc" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.142272 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ncdvx" event={"ID":"3d555627-e360-46ad-b6d9-ec1cc3ce1d68","Type":"ContainerDied","Data":"d5442ece930673a860856c148c3302cb31832c9594f9beb2b6afdddaef57e6ea"} Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.142321 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ncdvx" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.167099 4807 scope.go:117] "RemoveContainer" containerID="717fe89029fe7c4a025c42dd1001669dbc241143a589297bd283af161cd090a6" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.173173 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hrpb5"] Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.183656 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hrpb5"] Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.190423 4807 scope.go:117] "RemoveContainer" containerID="941459d432d6503c4a09073baa42647086df73febd836a0a79e4be11e2601240" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.195262 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95trv\" (UniqueName: \"kubernetes.io/projected/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-kube-api-access-95trv\") pod \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.195410 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-utilities\") pod \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.195573 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-catalog-content\") pod \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\" (UID: \"3d555627-e360-46ad-b6d9-ec1cc3ce1d68\") " Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.196402 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-utilities" (OuterVolumeSpecName: "utilities") pod "3d555627-e360-46ad-b6d9-ec1cc3ce1d68" (UID: "3d555627-e360-46ad-b6d9-ec1cc3ce1d68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.199874 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-kube-api-access-95trv" (OuterVolumeSpecName: "kube-api-access-95trv") pod "3d555627-e360-46ad-b6d9-ec1cc3ce1d68" (UID: "3d555627-e360-46ad-b6d9-ec1cc3ce1d68"). InnerVolumeSpecName "kube-api-access-95trv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.208742 4807 scope.go:117] "RemoveContainer" containerID="f2c1c5cc22244c0809370cfa96ec52ccaf2f98306881848624f37b4f661683a2" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.221627 4807 scope.go:117] "RemoveContainer" containerID="bbc2502ac4675046fa89b9b5f7f0bb6f9b0acf5b217ee938af6b833ae6778b51" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.236814 4807 scope.go:117] "RemoveContainer" containerID="97297ca81079be25d529bd22530880619781b1f002a7a085492c30343c0d2ad5" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.246743 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d555627-e360-46ad-b6d9-ec1cc3ce1d68" (UID: "3d555627-e360-46ad-b6d9-ec1cc3ce1d68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.297274 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95trv\" (UniqueName: \"kubernetes.io/projected/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-kube-api-access-95trv\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.297313 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.297324 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d555627-e360-46ad-b6d9-ec1cc3ce1d68-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.468063 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ncdvx"] Dec 02 20:01:47 crc kubenswrapper[4807]: I1202 20:01:47.471176 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ncdvx"] Dec 02 20:01:47 crc kubenswrapper[4807]: E1202 20:01:47.599166 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49f3d78d_194b_4b23_9bbc_cd89cd6fe402.slice/crio-297267db2def15a8785d534b49caa2b18ca338f6a0810ab8696a684049b528ac.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:01:48 crc kubenswrapper[4807]: I1202 20:01:48.985858 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" path="/var/lib/kubelet/pods/3d555627-e360-46ad-b6d9-ec1cc3ce1d68/volumes" Dec 02 20:01:48 crc kubenswrapper[4807]: I1202 20:01:48.986995 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" path="/var/lib/kubelet/pods/807e5ccc-6b8a-4be7-8b8c-34888237e22d/volumes" Dec 02 20:01:48 crc kubenswrapper[4807]: I1202 20:01:48.999184 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" containerName="oauth-openshift" containerID="cri-o://2f492293b01fc826df1b80ef2ebad2fa6f0a0f7a9d2fb144970d08b059c6bd83" gracePeriod=15 Dec 02 20:01:49 crc kubenswrapper[4807]: I1202 20:01:49.162455 4807 generic.go:334] "Generic (PLEG): container finished" podID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" containerID="297267db2def15a8785d534b49caa2b18ca338f6a0810ab8696a684049b528ac" exitCode=0 Dec 02 20:01:49 crc kubenswrapper[4807]: I1202 20:01:49.162574 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hz9f" event={"ID":"49f3d78d-194b-4b23-9bbc-cd89cd6fe402","Type":"ContainerDied","Data":"297267db2def15a8785d534b49caa2b18ca338f6a0810ab8696a684049b528ac"} Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.592390 4807 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.595879 4807 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.596234 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" containerName="extract-content" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596258 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" 
containerName="extract-content" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.596273 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" containerName="extract-utilities" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596281 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" containerName="extract-utilities" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.596291 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" containerName="extract-utilities" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596299 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" containerName="extract-utilities" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.596308 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" containerName="registry-server" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596315 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" containerName="registry-server" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.596326 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" containerName="registry-server" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596335 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" containerName="registry-server" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.596351 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" containerName="registry-server" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596358 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" 
containerName="registry-server" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.596371 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4a8446-37af-4473-9200-aafe3b5cefd9" containerName="pruner" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596380 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4a8446-37af-4473-9200-aafe3b5cefd9" containerName="pruner" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.596389 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" containerName="extract-content" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596396 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" containerName="extract-content" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.596411 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" containerName="extract-content" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596419 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" containerName="extract-content" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.596433 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" containerName="extract-utilities" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596440 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" containerName="extract-utilities" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596550 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="807e5ccc-6b8a-4be7-8b8c-34888237e22d" containerName="registry-server" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596568 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a4a8446-37af-4473-9200-aafe3b5cefd9" 
containerName="pruner" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596584 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f05f7b5-9f46-48c9-9b4f-a2497e87cc58" containerName="registry-server" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.596595 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d555627-e360-46ad-b6d9-ec1cc3ce1d68" containerName="registry-server" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.597058 4807 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.597394 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311" gracePeriod=15 Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.597481 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f" gracePeriod=15 Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.597568 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015" gracePeriod=15 Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.597480 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e" gracePeriod=15 Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.597573 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe" gracePeriod=15 Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.597983 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599263 4807 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.599432 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599452 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.599462 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599469 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.599478 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 20:01:51 crc 
kubenswrapper[4807]: I1202 20:01:51.599484 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.599492 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599498 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.599507 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599514 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.599522 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599528 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 20:01:51 crc kubenswrapper[4807]: E1202 20:01:51.599536 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599542 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599656 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 
02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599672 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599686 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599698 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599710 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.599954 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.661165 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.661209 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.661267 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.661377 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.661478 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.661712 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.661821 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.661885 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.681253 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763272 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763323 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763358 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763383 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763410 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763429 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763430 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763446 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763466 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763500 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763502 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763534 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763536 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763582 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 
20:01:51.763581 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.763678 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.769203 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.773787 4807 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.865233 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-catalog-content\") pod \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.865294 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg7fj\" (UniqueName: \"kubernetes.io/projected/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-kube-api-access-rg7fj\") pod \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.865373 4807 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-utilities\") pod \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\" (UID: \"49f3d78d-194b-4b23-9bbc-cd89cd6fe402\") " Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.866683 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-utilities" (OuterVolumeSpecName: "utilities") pod "49f3d78d-194b-4b23-9bbc-cd89cd6fe402" (UID: "49f3d78d-194b-4b23-9bbc-cd89cd6fe402"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.866937 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.870816 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-kube-api-access-rg7fj" (OuterVolumeSpecName: "kube-api-access-rg7fj") pod "49f3d78d-194b-4b23-9bbc-cd89cd6fe402" (UID: "49f3d78d-194b-4b23-9bbc-cd89cd6fe402"). InnerVolumeSpecName "kube-api-access-rg7fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.967993 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:01:51 crc kubenswrapper[4807]: I1202 20:01:51.968363 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg7fj\" (UniqueName: \"kubernetes.io/projected/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-kube-api-access-rg7fj\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:51 crc kubenswrapper[4807]: W1202 20:01:51.985713 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-810ddb87811f8734c2b8c2b093707d01a10ab601acb3d5c4e360b10d65378789 WatchSource:0}: Error finding container 810ddb87811f8734c2b8c2b093707d01a10ab601acb3d5c4e360b10d65378789: Status 404 returned error can't find the container with id 810ddb87811f8734c2b8c2b093707d01a10ab601acb3d5c4e360b10d65378789 Dec 02 20:01:52 crc kubenswrapper[4807]: I1202 20:01:52.184575 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"810ddb87811f8734c2b8c2b093707d01a10ab601acb3d5c4e360b10d65378789"} Dec 02 20:01:52 crc kubenswrapper[4807]: I1202 20:01:52.187766 4807 generic.go:334] "Generic (PLEG): container finished" podID="30d55e05-d66f-496a-b4e1-fb6deb38895f" containerID="2f492293b01fc826df1b80ef2ebad2fa6f0a0f7a9d2fb144970d08b059c6bd83" exitCode=0 Dec 02 20:01:52 crc kubenswrapper[4807]: I1202 20:01:52.187877 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" event={"ID":"30d55e05-d66f-496a-b4e1-fb6deb38895f","Type":"ContainerDied","Data":"2f492293b01fc826df1b80ef2ebad2fa6f0a0f7a9d2fb144970d08b059c6bd83"} Dec 02 20:01:52 crc kubenswrapper[4807]: I1202 20:01:52.190869 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7hz9f" 
event={"ID":"49f3d78d-194b-4b23-9bbc-cd89cd6fe402","Type":"ContainerDied","Data":"4c03858aa1897863dfef8d7836b88db0d1e8ac8f3c9b75b93486db9df8e4cff4"} Dec 02 20:01:52 crc kubenswrapper[4807]: I1202 20:01:52.190951 4807 scope.go:117] "RemoveContainer" containerID="297267db2def15a8785d534b49caa2b18ca338f6a0810ab8696a684049b528ac" Dec 02 20:01:52 crc kubenswrapper[4807]: I1202 20:01:52.190964 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7hz9f" Dec 02 20:01:52 crc kubenswrapper[4807]: I1202 20:01:52.192285 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:52 crc kubenswrapper[4807]: I1202 20:01:52.206326 4807 scope.go:117] "RemoveContainer" containerID="554165af4632877186f0983b67719a83ac22252033adda26f83d03dd632c0bc5" Dec 02 20:01:52 crc kubenswrapper[4807]: I1202 20:01:52.225798 4807 scope.go:117] "RemoveContainer" containerID="6343605f7e88645c656263b491054c069152317c128b3ba179edce8a6709f9cb" Dec 02 20:01:52 crc kubenswrapper[4807]: I1202 20:01:52.803660 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49f3d78d-194b-4b23-9bbc-cd89cd6fe402" (UID: "49f3d78d-194b-4b23-9bbc-cd89cd6fe402"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:01:52 crc kubenswrapper[4807]: I1202 20:01:52.881545 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49f3d78d-194b-4b23-9bbc-cd89cd6fe402-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:53 crc kubenswrapper[4807]: I1202 20:01:53.096168 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:53 crc kubenswrapper[4807]: I1202 20:01:53.202362 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 20:01:53 crc kubenswrapper[4807]: I1202 20:01:53.204384 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 20:01:53 crc kubenswrapper[4807]: I1202 20:01:53.205747 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f" exitCode=0 Dec 02 20:01:53 crc kubenswrapper[4807]: I1202 20:01:53.205789 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe" exitCode=2 Dec 02 20:01:53 crc kubenswrapper[4807]: I1202 20:01:53.205908 4807 scope.go:117] "RemoveContainer" containerID="17b77ea5095ed3e9521f0d8f2ca508bb5416d978203f2a189d1fb49a4d59e40a" Dec 02 20:01:54 crc kubenswrapper[4807]: I1202 20:01:54.202472 4807 patch_prober.go:28] interesting 
pod/oauth-openshift-558db77b4-v4lp2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" start-of-body= Dec 02 20:01:54 crc kubenswrapper[4807]: I1202 20:01:54.202856 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" Dec 02 20:01:54 crc kubenswrapper[4807]: E1202 20:01:54.203387 4807 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/events/oauth-openshift-558db77b4-v4lp2.187d7e668bbdd938\": dial tcp 38.102.83.36:6443: connect: connection refused" event=< Dec 02 20:01:54 crc kubenswrapper[4807]: &Event{ObjectMeta:{oauth-openshift-558db77b4-v4lp2.187d7e668bbdd938 openshift-authentication 28093 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-authentication,Name:oauth-openshift-558db77b4-v4lp2,UID:30d55e05-d66f-496a-b4e1-fb6deb38895f,APIVersion:v1,ResourceVersion:27221,FieldPath:spec.containers{oauth-openshift},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.217.0.22:6443/healthz": dial tcp 10.217.0.22:6443: connect: connection refused Dec 02 20:01:54 crc kubenswrapper[4807]: body: Dec 02 20:01:54 crc kubenswrapper[4807]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 20:00:07 +0000 UTC,LastTimestamp:2025-12-02 20:01:54.20283022 +0000 UTC m=+249.503737725,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Dec 02 20:01:54 crc kubenswrapper[4807]: > Dec 
02 20:01:54 crc kubenswrapper[4807]: I1202 20:01:54.215461 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 20:01:54 crc kubenswrapper[4807]: I1202 20:01:54.216131 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e" exitCode=0 Dec 02 20:01:54 crc kubenswrapper[4807]: I1202 20:01:54.975597 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.119771 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.120204 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.120492 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.215606 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-serving-cert\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.215957 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc5kl\" (UniqueName: \"kubernetes.io/projected/30d55e05-d66f-496a-b4e1-fb6deb38895f-kube-api-access-wc5kl\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.215976 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-session\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.215994 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-ocp-branding-template\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.216011 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-trusted-ca-bundle\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.216036 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-service-ca\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.216100 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-policies\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.216135 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-cliconfig\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.216151 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-idp-0-file-data\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.216185 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-error\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.216213 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-login\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.216253 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-dir\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.216275 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-router-certs\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.216305 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-provider-selection\") pod \"30d55e05-d66f-496a-b4e1-fb6deb38895f\" (UID: \"30d55e05-d66f-496a-b4e1-fb6deb38895f\") " Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.218252 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.218276 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.225102 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.227249 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.234780 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.234891 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d55e05-d66f-496a-b4e1-fb6deb38895f-kube-api-access-wc5kl" (OuterVolumeSpecName: "kube-api-access-wc5kl") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "kube-api-access-wc5kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.235028 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.235072 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.236234 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.238855 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.239903 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.242260 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.242756 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.243556 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.247848 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "30d55e05-d66f-496a-b4e1-fb6deb38895f" (UID: "30d55e05-d66f-496a-b4e1-fb6deb38895f"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.249000 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015" exitCode=0 Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.257014 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"055b6824a58294033ccbf278b111975f6fc2b7526fc07a89364833c0e6505f2c"} Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.258496 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.258742 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.258937 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.260408 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" event={"ID":"30d55e05-d66f-496a-b4e1-fb6deb38895f","Type":"ContainerDied","Data":"238b7757227bf360cacfa4d9fdaeaf0cbc820e629e53dab7aac55c0ce33c7899"} Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.260445 4807 scope.go:117] "RemoveContainer" containerID="2f492293b01fc826df1b80ef2ebad2fa6f0a0f7a9d2fb144970d08b059c6bd83" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.260530 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.261061 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.261573 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.261856 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.264227 4807 generic.go:334] "Generic (PLEG): container finished" podID="f888ee60-73ba-421a-9262-bc59ba72820f" 
containerID="91ccc63f354fb4fc6d8f6167aacc7757404aea881055cfa2d5de2fc4a6325e13" exitCode=0 Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.264302 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f888ee60-73ba-421a-9262-bc59ba72820f","Type":"ContainerDied","Data":"91ccc63f354fb4fc6d8f6167aacc7757404aea881055cfa2d5de2fc4a6325e13"} Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.265101 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.265380 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.265524 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.265680 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: 
connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.290041 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.290689 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.291054 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.291430 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318492 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 
20:01:55.318535 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318552 4807 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318564 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318576 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318586 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318597 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318607 4807 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/30d55e05-d66f-496a-b4e1-fb6deb38895f-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc 
kubenswrapper[4807]: I1202 20:01:55.318620 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318631 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318707 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318738 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc5kl\" (UniqueName: \"kubernetes.io/projected/30d55e05-d66f-496a-b4e1-fb6deb38895f-kube-api-access-wc5kl\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318748 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:55 crc kubenswrapper[4807]: I1202 20:01:55.318757 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/30d55e05-d66f-496a-b4e1-fb6deb38895f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.276057 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.278882 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311" exitCode=0 Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.383623 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.384709 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.385760 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.386772 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.387271 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.36:6443: connect: connection refused" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.387897 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.388364 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.434771 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.434810 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.434840 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.435183 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.435200 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.435221 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.536657 4807 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.536705 4807 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.536733 4807 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.574581 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.575240 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.575634 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.576149 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.576404 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.576640 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.637509 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-kubelet-dir\") pod \"f888ee60-73ba-421a-9262-bc59ba72820f\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.637598 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-var-lock\") pod \"f888ee60-73ba-421a-9262-bc59ba72820f\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.637611 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f888ee60-73ba-421a-9262-bc59ba72820f" (UID: "f888ee60-73ba-421a-9262-bc59ba72820f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.637700 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f888ee60-73ba-421a-9262-bc59ba72820f-kube-api-access\") pod \"f888ee60-73ba-421a-9262-bc59ba72820f\" (UID: \"f888ee60-73ba-421a-9262-bc59ba72820f\") " Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.637778 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-var-lock" (OuterVolumeSpecName: "var-lock") pod "f888ee60-73ba-421a-9262-bc59ba72820f" (UID: "f888ee60-73ba-421a-9262-bc59ba72820f"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.637976 4807 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.637999 4807 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f888ee60-73ba-421a-9262-bc59ba72820f-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.643540 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f888ee60-73ba-421a-9262-bc59ba72820f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f888ee60-73ba-421a-9262-bc59ba72820f" (UID: "f888ee60-73ba-421a-9262-bc59ba72820f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.739084 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f888ee60-73ba-421a-9262-bc59ba72820f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 20:01:56 crc kubenswrapper[4807]: I1202 20:01:56.978035 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.284842 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f888ee60-73ba-421a-9262-bc59ba72820f","Type":"ContainerDied","Data":"b8f16bfcc5228715a4951b4bc676056f8727edc89f1c5a9917924e3cecb11365"} Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.285138 4807 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="b8f16bfcc5228715a4951b4bc676056f8727edc89f1c5a9917924e3cecb11365" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.284935 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.288520 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.289692 4807 scope.go:117] "RemoveContainer" containerID="1653cd413238e5cb817aa29a1add3ae267b86e2906cb744e64a76269bbee7d2f" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.289735 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.289770 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.290020 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.290640 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.290993 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.291566 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.292878 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.293150 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.293438 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.293877 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.294707 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.295054 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.295435 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.296015 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.296223 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.315605 4807 scope.go:117] "RemoveContainer" containerID="7745ebc08f428cb601f8ce8977301ed5fa131da34d2881a41adb1679f495c46e" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.329564 4807 scope.go:117] "RemoveContainer" containerID="876ec7319bcf163943464e60adb714b52959a5bc2e66d0a955ff111ee9b02015" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.344268 4807 scope.go:117] "RemoveContainer" containerID="5c6cfc2ae86b50717e2adc707f5652efc1b5f91f7aaff8fd1ab1ebc4e26450fe" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.365236 4807 scope.go:117] 
"RemoveContainer" containerID="31e23a6139edd62c072b9dfc7d5a1b33c184346ada34639118ce11c7b8c6f311" Dec 02 20:01:57 crc kubenswrapper[4807]: I1202 20:01:57.385452 4807 scope.go:117] "RemoveContainer" containerID="4b7e2c58e05f90ad762d219b7b2b00e7576da7eea09ad47c2035feb41290e982" Dec 02 20:01:57 crc kubenswrapper[4807]: E1202 20:01:57.758190 4807 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/events/oauth-openshift-558db77b4-v4lp2.187d7e668bbdd938\": dial tcp 38.102.83.36:6443: connect: connection refused" event=< Dec 02 20:01:57 crc kubenswrapper[4807]: &Event{ObjectMeta:{oauth-openshift-558db77b4-v4lp2.187d7e668bbdd938 openshift-authentication 28093 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-authentication,Name:oauth-openshift-558db77b4-v4lp2,UID:30d55e05-d66f-496a-b4e1-fb6deb38895f,APIVersion:v1,ResourceVersion:27221,FieldPath:spec.containers{oauth-openshift},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.217.0.22:6443/healthz": dial tcp 10.217.0.22:6443: connect: connection refused Dec 02 20:01:57 crc kubenswrapper[4807]: body: Dec 02 20:01:57 crc kubenswrapper[4807]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 20:00:07 +0000 UTC,LastTimestamp:2025-12-02 20:01:54.20283022 +0000 UTC m=+249.503737725,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Dec 02 20:01:57 crc kubenswrapper[4807]: > Dec 02 20:01:59 crc kubenswrapper[4807]: E1202 20:01:59.424497 4807 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:59 crc kubenswrapper[4807]: E1202 20:01:59.426838 4807 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:59 crc kubenswrapper[4807]: E1202 20:01:59.427156 4807 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:59 crc kubenswrapper[4807]: E1202 20:01:59.427592 4807 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:59 crc kubenswrapper[4807]: E1202 20:01:59.427875 4807 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:01:59 crc kubenswrapper[4807]: I1202 20:01:59.427907 4807 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 02 20:01:59 crc kubenswrapper[4807]: E1202 20:01:59.428080 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="200ms" Dec 02 20:01:59 crc kubenswrapper[4807]: E1202 20:01:59.629172 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="400ms" Dec 02 20:02:00 crc 
kubenswrapper[4807]: E1202 20:02:00.030759 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="800ms" Dec 02 20:02:00 crc kubenswrapper[4807]: E1202 20:02:00.832407 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="1.6s" Dec 02 20:02:02 crc kubenswrapper[4807]: E1202 20:02:02.433920 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="3.2s" Dec 02 20:02:04 crc kubenswrapper[4807]: I1202 20:02:04.972037 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:02:04 crc kubenswrapper[4807]: I1202 20:02:04.975709 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:04 crc kubenswrapper[4807]: I1202 20:02:04.975994 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:04 crc kubenswrapper[4807]: I1202 20:02:04.976247 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:04 crc kubenswrapper[4807]: I1202 20:02:04.976480 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:04 crc kubenswrapper[4807]: I1202 20:02:04.976777 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:04 crc kubenswrapper[4807]: I1202 20:02:04.977027 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:04 crc kubenswrapper[4807]: I1202 20:02:04.977265 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:04 crc kubenswrapper[4807]: I1202 20:02:04.977495 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:05 crc kubenswrapper[4807]: I1202 20:02:05.004800 4807 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="932e803a-d171-4e99-b9bc-4776e51bfc97" Dec 02 20:02:05 crc kubenswrapper[4807]: I1202 20:02:05.004852 4807 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="932e803a-d171-4e99-b9bc-4776e51bfc97" Dec 02 20:02:05 crc kubenswrapper[4807]: E1202 20:02:05.005441 4807 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:02:05 crc kubenswrapper[4807]: I1202 20:02:05.006156 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:02:05 crc kubenswrapper[4807]: W1202 20:02:05.043505 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-15ff84120610029a009387971e912d95d1e016cbd9d0fe26c536467e74884c10 WatchSource:0}: Error finding container 15ff84120610029a009387971e912d95d1e016cbd9d0fe26c536467e74884c10: Status 404 returned error can't find the container with id 15ff84120610029a009387971e912d95d1e016cbd9d0fe26c536467e74884c10 Dec 02 20:02:05 crc kubenswrapper[4807]: I1202 20:02:05.340167 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"15ff84120610029a009387971e912d95d1e016cbd9d0fe26c536467e74884c10"} Dec 02 20:02:05 crc kubenswrapper[4807]: E1202 20:02:05.635251 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.36:6443: connect: connection refused" interval="6.4s" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.043157 4807 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54038->192.168.126.11:10257: read: connection reset by peer" start-of-body= Dec 02 20:02:06 crc 
kubenswrapper[4807]: I1202 20:02:06.043232 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54038->192.168.126.11:10257: read: connection reset by peer" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.350917 4807 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="6efcb0edde6a627230b71085ce3b17c084be2837f47f5cd7ba3a06f6cd881a01" exitCode=0 Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.350984 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"6efcb0edde6a627230b71085ce3b17c084be2837f47f5cd7ba3a06f6cd881a01"} Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.351428 4807 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="932e803a-d171-4e99-b9bc-4776e51bfc97" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.351482 4807 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="932e803a-d171-4e99-b9bc-4776e51bfc97" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.352235 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:06 crc kubenswrapper[4807]: E1202 20:02:06.352279 4807 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.353041 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.353620 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.354069 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.355736 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.355856 4807 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c" exitCode=1 Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.355900 4807 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c"} Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.356532 4807 scope.go:117] "RemoveContainer" containerID="fafa4ab4d19eb2b4f1993fef6b574cd4e63a244b46a37b4a4f6439868674222c" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.356926 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.357348 4807 status_manager.go:851] "Failed to get status for pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.357748 4807 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.358009 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:06 crc kubenswrapper[4807]: I1202 20:02:06.358285 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:07 crc kubenswrapper[4807]: I1202 20:02:07.374498 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fabcbd74e3a4303ae19a561a540f567191b7bf72de2f4bc4ccdddbb82d64570b"} Dec 02 20:02:07 crc kubenswrapper[4807]: I1202 20:02:07.379881 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 20:02:07 crc kubenswrapper[4807]: I1202 20:02:07.379946 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"023b64bda1ad7899ffa2f2f2ac714fb1d393d4b7039561cd469f97a9dc674935"} Dec 02 20:02:07 crc kubenswrapper[4807]: I1202 20:02:07.381624 4807 status_manager.go:851] "Failed to get status for pod" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" pod="openshift-authentication/oauth-openshift-558db77b4-v4lp2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-v4lp2\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:07 crc kubenswrapper[4807]: I1202 20:02:07.382506 4807 status_manager.go:851] "Failed to get status for 
pod" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" pod="openshift-marketplace/redhat-operators-7hz9f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7hz9f\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:07 crc kubenswrapper[4807]: I1202 20:02:07.383057 4807 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:07 crc kubenswrapper[4807]: I1202 20:02:07.383442 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:07 crc kubenswrapper[4807]: I1202 20:02:07.383904 4807 status_manager.go:851] "Failed to get status for pod" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.36:6443: connect: connection refused" Dec 02 20:02:09 crc kubenswrapper[4807]: I1202 20:02:09.396121 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cc74c31a59e4e2d1d51b212c958fa627c9643030210f7df7c62819ed79a51e99"} Dec 02 20:02:09 crc kubenswrapper[4807]: I1202 20:02:09.396696 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 
20:02:09 crc kubenswrapper[4807]: I1202 20:02:09.396708 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c9827dbffa8fc56a33ffbf44750769649cbf5e99d778bb1ba96fa29786db958a"} Dec 02 20:02:09 crc kubenswrapper[4807]: I1202 20:02:09.396470 4807 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="932e803a-d171-4e99-b9bc-4776e51bfc97" Dec 02 20:02:09 crc kubenswrapper[4807]: I1202 20:02:09.396757 4807 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="932e803a-d171-4e99-b9bc-4776e51bfc97" Dec 02 20:02:09 crc kubenswrapper[4807]: I1202 20:02:09.396735 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5f5f56ab0b9966326917779ae1b7786d8c864c88e415261d5b7b8c0d306c4096"} Dec 02 20:02:09 crc kubenswrapper[4807]: I1202 20:02:09.396927 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7e625c1bc9b45893e3815f1537de9de540b157ba4c6028de1a89175021b1f757"} Dec 02 20:02:10 crc kubenswrapper[4807]: I1202 20:02:10.007215 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:02:10 crc kubenswrapper[4807]: I1202 20:02:10.007605 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:02:10 crc kubenswrapper[4807]: I1202 20:02:10.014886 4807 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[+]ping ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]log ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]etcd ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/generic-apiserver-start-informers ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/priority-and-fairness-filter ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/start-apiextensions-informers ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/start-apiextensions-controllers ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/crd-informer-synced ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/start-system-namespaces-controller ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 02 
20:02:10 crc kubenswrapper[4807]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/bootstrap-controller ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/start-kube-aggregator-informers ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/apiservice-registration-controller ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/apiservice-discovery-controller ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]autoregister-completion ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/apiservice-openapi-controller ok Dec 02 20:02:10 crc kubenswrapper[4807]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 02 20:02:10 crc kubenswrapper[4807]: livez check failed Dec 02 20:02:10 crc kubenswrapper[4807]: I1202 20:02:10.015826 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 20:02:14 crc kubenswrapper[4807]: I1202 20:02:14.406286 4807 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:02:14 crc 
kubenswrapper[4807]: I1202 20:02:14.993690 4807 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="30136c4e-0246-484f-b3a5-5298365a68b2" Dec 02 20:02:15 crc kubenswrapper[4807]: I1202 20:02:15.432635 4807 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="932e803a-d171-4e99-b9bc-4776e51bfc97" Dec 02 20:02:15 crc kubenswrapper[4807]: I1202 20:02:15.432675 4807 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="932e803a-d171-4e99-b9bc-4776e51bfc97" Dec 02 20:02:15 crc kubenswrapper[4807]: I1202 20:02:15.437059 4807 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="30136c4e-0246-484f-b3a5-5298365a68b2" Dec 02 20:02:15 crc kubenswrapper[4807]: I1202 20:02:15.548289 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 20:02:15 crc kubenswrapper[4807]: I1202 20:02:15.552676 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 20:02:16 crc kubenswrapper[4807]: I1202 20:02:16.029574 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 20:02:17 crc kubenswrapper[4807]: I1202 20:02:17.449928 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 20:02:23 crc kubenswrapper[4807]: I1202 20:02:23.789055 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 20:02:24 crc kubenswrapper[4807]: I1202 20:02:24.536759 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 20:02:24 crc kubenswrapper[4807]: I1202 20:02:24.554803 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 20:02:24 crc kubenswrapper[4807]: I1202 20:02:24.644814 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 20:02:25 crc kubenswrapper[4807]: I1202 20:02:25.378525 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 20:02:25 crc kubenswrapper[4807]: I1202 20:02:25.453519 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 20:02:25 crc kubenswrapper[4807]: I1202 20:02:25.463919 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 20:02:25 crc kubenswrapper[4807]: I1202 20:02:25.500091 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 20:02:25 crc kubenswrapper[4807]: I1202 20:02:25.925225 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 20:02:26 crc kubenswrapper[4807]: I1202 20:02:26.342789 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 20:02:26 crc kubenswrapper[4807]: I1202 20:02:26.711963 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 20:02:26 crc kubenswrapper[4807]: I1202 
20:02:26.797823 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 20:02:26 crc kubenswrapper[4807]: I1202 20:02:26.906686 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 20:02:26 crc kubenswrapper[4807]: I1202 20:02:26.922918 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 20:02:27 crc kubenswrapper[4807]: I1202 20:02:27.170895 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 20:02:27 crc kubenswrapper[4807]: I1202 20:02:27.186445 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 20:02:27 crc kubenswrapper[4807]: I1202 20:02:27.226556 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:27.471757 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:27.511215 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:27.533437 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:27.544809 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:27.575671 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:27.737477 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:27.806385 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:27.808096 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:27.879168 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.035236 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.035352 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.066678 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.104222 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.124732 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.154686 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 
20:02:28.186031 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.225992 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.246863 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.300969 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.390606 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.404643 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.435488 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.506786 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.613079 4807 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.788226 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.812213 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 20:02:28 crc 
kubenswrapper[4807]: I1202 20:02:28.850038 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 20:02:28 crc kubenswrapper[4807]: I1202 20:02:28.993734 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.035523 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.036083 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.270021 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.306211 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.400772 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.413677 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.496818 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.564392 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.593113 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"audit-1" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.712512 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.746055 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.776547 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.776547 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.838100 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 20:02:29 crc kubenswrapper[4807]: I1202 20:02:29.874752 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.018937 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.078282 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.155586 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.241923 4807 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 20:02:30 crc 
kubenswrapper[4807]: I1202 20:02:30.276961 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.300910 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.328993 4807 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.330883 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.336511 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.336478735 podStartE2EDuration="39.336478735s" podCreationTimestamp="2025-12-02 20:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:02:14.518480229 +0000 UTC m=+269.819387744" watchObservedRunningTime="2025-12-02 20:02:30.336478735 +0000 UTC m=+285.637386270"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.337054 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/redhat-operators-7hz9f","openshift-authentication/oauth-openshift-558db77b4-v4lp2"]
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.337129 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.341398 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.360156 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.360138477 podStartE2EDuration="16.360138477s" podCreationTimestamp="2025-12-02 20:02:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:02:30.355476335 +0000 UTC m=+285.656383840" watchObservedRunningTime="2025-12-02 20:02:30.360138477 +0000 UTC m=+285.661045972"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.378566 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.514826 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.571796 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.611089 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.653197 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.748110 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.803371 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.829852 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.836686 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.845292 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.898108 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.915823 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.921241 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.979331 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" path="/var/lib/kubelet/pods/30d55e05-d66f-496a-b4e1-fb6deb38895f/volumes"
Dec 02 20:02:30 crc kubenswrapper[4807]: I1202 20:02:30.980003 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" path="/var/lib/kubelet/pods/49f3d78d-194b-4b23-9bbc-cd89cd6fe402/volumes"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.009520 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.039089 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.039363 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.042030 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.098885 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"]
Dec 02 20:02:31 crc kubenswrapper[4807]: E1202 20:02:31.099134 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" containerName="oauth-openshift"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.099150 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" containerName="oauth-openshift"
Dec 02 20:02:31 crc kubenswrapper[4807]: E1202 20:02:31.099161 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" containerName="installer"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.099168 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" containerName="installer"
Dec 02 20:02:31 crc kubenswrapper[4807]: E1202 20:02:31.099182 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" containerName="extract-utilities"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.099191 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" containerName="extract-utilities"
Dec 02 20:02:31 crc kubenswrapper[4807]: E1202 20:02:31.099200 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" containerName="registry-server"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.099207 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" containerName="registry-server"
Dec 02 20:02:31 crc kubenswrapper[4807]: E1202 20:02:31.099219 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" containerName="extract-content"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.099225 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" containerName="extract-content"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.099386 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f888ee60-73ba-421a-9262-bc59ba72820f" containerName="installer"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.099402 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f3d78d-194b-4b23-9bbc-cd89cd6fe402" containerName="registry-server"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.099413 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d55e05-d66f-496a-b4e1-fb6deb38895f" containerName="oauth-openshift"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.099938 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.102406 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.105813 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.107111 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.107160 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.107605 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.107769 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.107919 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.108241 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.108308 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.108618 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.108732 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.110077 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.113555 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.116790 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.118888 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.140771 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.160714 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.161019 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-router-certs\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.161137 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-audit-policies\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.161245 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.161374 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.161473 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-template-login\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.161590 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-template-error\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.161695 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.161865 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.161971 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-service-ca\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.162059 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6935ccb-3bf8-4016-8840-af311066caa2-audit-dir\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.162135 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.162308 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-session\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.162406 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njcdw\" (UniqueName: \"kubernetes.io/projected/c6935ccb-3bf8-4016-8840-af311066caa2-kube-api-access-njcdw\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.162739 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.206298 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.215841 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.223838 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263444 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-template-error\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263505 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263541 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263569 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-service-ca\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263607 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6935ccb-3bf8-4016-8840-af311066caa2-audit-dir\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263630 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263654 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-session\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263676 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njcdw\" (UniqueName: \"kubernetes.io/projected/c6935ccb-3bf8-4016-8840-af311066caa2-kube-api-access-njcdw\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263741 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263773 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-router-certs\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263811 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-audit-policies\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263844 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263850 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6935ccb-3bf8-4016-8840-af311066caa2-audit-dir\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263881 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.263950 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-template-login\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.265857 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-service-ca\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.267111 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-audit-policies\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.267581 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.267655 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.270997 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-template-error\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.271100 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.271157 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-session\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.271418 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.271550 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.272691 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.277440 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-router-certs\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.277847 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.284366 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6935ccb-3bf8-4016-8840-af311066caa2-v4-0-config-user-template-login\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.287331 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njcdw\" (UniqueName: \"kubernetes.io/projected/c6935ccb-3bf8-4016-8840-af311066caa2-kube-api-access-njcdw\") pod \"oauth-openshift-6dc597b7cf-wk5mr\" (UID: \"c6935ccb-3bf8-4016-8840-af311066caa2\") " pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.314990 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.329608 4807 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.340254 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.342744 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.377118 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.418053 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.449956 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.551827 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.574562 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.644601 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.828511 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.942155 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.957620 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 02 20:02:31 crc kubenswrapper[4807]: I1202 20:02:31.980164 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.022406 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.084488 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.100292 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.171930 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.173279 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.265374 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.318699 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.366875 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.427983 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.562009 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.625501 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.648568 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.699304 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.750571 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.752306 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.772415 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.848963 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 02 20:02:32 crc kubenswrapper[4807]: I1202 20:02:32.910859 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.124861 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.166374 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.175857 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.344276 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.345311 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.384138 4807 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.416265 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.422568 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"]
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.441788 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.557605 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.639570 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr"]
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.655046 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.658118 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.688966 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.699218 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.711248 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.730170 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.741315 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.745786 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.802483 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.853977 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.858341 4807 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.897037 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.898129 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 20:02:33 crc kubenswrapper[4807]: I1202 20:02:33.976619 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.047018 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.048301 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.094281 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.106473 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.175158 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.295182 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.317564 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.352854 4807 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.360828 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.395062 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.397010 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.442661 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.518867 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.538753 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.567292 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr" event={"ID":"c6935ccb-3bf8-4016-8840-af311066caa2","Type":"ContainerStarted","Data":"d9a522b064517a99df0d4231290044f04b47759eec90433c3215c1281e7b55cb"} Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.567354 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr" event={"ID":"c6935ccb-3bf8-4016-8840-af311066caa2","Type":"ContainerStarted","Data":"2d6c1487c6e9e4f5b6ec41bb1ccd22bc790cbf039a083fa27b5eda203c74213b"} Dec 02 20:02:34 crc kubenswrapper[4807]: 
I1202 20:02:34.568246 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.571169 4807 patch_prober.go:28] interesting pod/oauth-openshift-6dc597b7cf-wk5mr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" start-of-body= Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.571262 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr" podUID="c6935ccb-3bf8-4016-8840-af311066caa2" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.583494 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.599241 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr" podStartSLOduration=71.599215462 podStartE2EDuration="1m11.599215462s" podCreationTimestamp="2025-12-02 20:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:02:34.596797828 +0000 UTC m=+289.897705323" watchObservedRunningTime="2025-12-02 20:02:34.599215462 +0000 UTC m=+289.900122947" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.647229 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.759954 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-oauth-config" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.785264 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 20:02:34 crc kubenswrapper[4807]: I1202 20:02:34.842949 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.016288 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.017292 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.021844 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.023072 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.033121 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.111465 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.137467 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.146747 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.180022 4807 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.296954 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.319288 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.360014 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.476668 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.495280 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.578894 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6dc597b7cf-wk5mr" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.586468 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.654570 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.686549 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.687497 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 20:02:35 crc 
kubenswrapper[4807]: I1202 20:02:35.759798 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.931854 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 20:02:35 crc kubenswrapper[4807]: I1202 20:02:35.974177 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 20:02:36 crc kubenswrapper[4807]: I1202 20:02:36.255478 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 20:02:36 crc kubenswrapper[4807]: I1202 20:02:36.308507 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 20:02:36 crc kubenswrapper[4807]: I1202 20:02:36.482125 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 20:02:36 crc kubenswrapper[4807]: I1202 20:02:36.578001 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 20:02:36 crc kubenswrapper[4807]: I1202 20:02:36.611991 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 20:02:36 crc kubenswrapper[4807]: I1202 20:02:36.664938 4807 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 20:02:36 crc kubenswrapper[4807]: I1202 20:02:36.665182 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" 
containerID="cri-o://055b6824a58294033ccbf278b111975f6fc2b7526fc07a89364833c0e6505f2c" gracePeriod=5 Dec 02 20:02:36 crc kubenswrapper[4807]: I1202 20:02:36.738307 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 20:02:36 crc kubenswrapper[4807]: I1202 20:02:36.841984 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 20:02:36 crc kubenswrapper[4807]: I1202 20:02:36.848653 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 20:02:36 crc kubenswrapper[4807]: I1202 20:02:36.880540 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.032632 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.073120 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.130402 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.132634 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.176291 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.248351 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.294530 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.294763 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.315654 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.329905 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.331876 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.406448 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.433735 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.435443 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.481776 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.550021 4807 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.597469 4807 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.610067 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.629001 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 20:02:37 crc kubenswrapper[4807]: I1202 20:02:37.843182 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.009989 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.221411 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.224399 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.267219 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.406336 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.417293 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.447659 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.478206 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.613261 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.616786 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.772875 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 20:02:38 crc kubenswrapper[4807]: I1202 20:02:38.946613 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 20:02:39 crc kubenswrapper[4807]: I1202 20:02:39.245563 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 20:02:39 crc kubenswrapper[4807]: I1202 20:02:39.414907 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 20:02:39 crc kubenswrapper[4807]: I1202 20:02:39.424264 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 20:02:39 crc kubenswrapper[4807]: I1202 20:02:39.428889 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 20:02:39 crc kubenswrapper[4807]: I1202 20:02:39.588128 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 20:02:39 crc 
kubenswrapper[4807]: I1202 20:02:39.757421 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 20:02:39 crc kubenswrapper[4807]: I1202 20:02:39.775068 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 20:02:39 crc kubenswrapper[4807]: I1202 20:02:39.796112 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 20:02:40 crc kubenswrapper[4807]: I1202 20:02:40.162885 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 20:02:40 crc kubenswrapper[4807]: I1202 20:02:40.282856 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 20:02:40 crc kubenswrapper[4807]: I1202 20:02:40.286334 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 20:02:40 crc kubenswrapper[4807]: I1202 20:02:40.603584 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.628347 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.628827 4807 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="055b6824a58294033ccbf278b111975f6fc2b7526fc07a89364833c0e6505f2c" exitCode=137 Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.753938 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 20:02:44 crc 
kubenswrapper[4807]: I1202 20:02:44.754596 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.878676 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.878807 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.878871 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.878909 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.878933 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.878946 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.878996 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.879029 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.879066 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.879331 4807 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.879362 4807 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.879378 4807 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.879393 4807 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.888833 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.980214 4807 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.982051 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 02 20:02:44 crc kubenswrapper[4807]: I1202 20:02:44.982448 4807 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 02 20:02:45 crc kubenswrapper[4807]: I1202 20:02:45.000659 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 20:02:45 crc kubenswrapper[4807]: I1202 20:02:45.000752 4807 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c4b32b33-5cbf-434d-a010-2774759ae707" Dec 02 20:02:45 crc kubenswrapper[4807]: I1202 20:02:45.011284 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 20:02:45 crc kubenswrapper[4807]: I1202 20:02:45.011436 4807 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c4b32b33-5cbf-434d-a010-2774759ae707" Dec 02 20:02:45 crc kubenswrapper[4807]: I1202 20:02:45.637873 4807 scope.go:117] "RemoveContainer" containerID="055b6824a58294033ccbf278b111975f6fc2b7526fc07a89364833c0e6505f2c" Dec 02 20:02:45 crc kubenswrapper[4807]: I1202 20:02:45.637935 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 20:02:56 crc kubenswrapper[4807]: I1202 20:02:56.632518 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 20:02:58 crc kubenswrapper[4807]: I1202 20:02:58.670500 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kdp2m"] Dec 02 20:02:58 crc kubenswrapper[4807]: I1202 20:02:58.671080 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" podUID="af5c02f0-23a0-45e5-80ae-3510d6d908dc" containerName="controller-manager" containerID="cri-o://94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472" gracePeriod=30 Dec 02 20:02:58 crc kubenswrapper[4807]: I1202 20:02:58.772441 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb"] Dec 02 20:02:58 crc kubenswrapper[4807]: I1202 20:02:58.772780 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" podUID="10c98701-5de0-4c9b-a109-01a2985dc868" containerName="route-controller-manager" containerID="cri-o://491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4" gracePeriod=30 Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.546500 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.626655 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.709193 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af5c02f0-23a0-45e5-80ae-3510d6d908dc-serving-cert\") pod \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.709286 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz285\" (UniqueName: \"kubernetes.io/projected/af5c02f0-23a0-45e5-80ae-3510d6d908dc-kube-api-access-bz285\") pod \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.709322 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-proxy-ca-bundles\") pod \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.709362 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10c98701-5de0-4c9b-a109-01a2985dc868-serving-cert\") pod \"10c98701-5de0-4c9b-a109-01a2985dc868\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.709391 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-config\") pod \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.709412 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-client-ca\") pod \"10c98701-5de0-4c9b-a109-01a2985dc868\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.709435 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljw4w\" (UniqueName: \"kubernetes.io/projected/10c98701-5de0-4c9b-a109-01a2985dc868-kube-api-access-ljw4w\") pod \"10c98701-5de0-4c9b-a109-01a2985dc868\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.709484 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-client-ca\") pod \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\" (UID: \"af5c02f0-23a0-45e5-80ae-3510d6d908dc\") " Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.709543 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-config\") pod \"10c98701-5de0-4c9b-a109-01a2985dc868\" (UID: \"10c98701-5de0-4c9b-a109-01a2985dc868\") " Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.710176 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "af5c02f0-23a0-45e5-80ae-3510d6d908dc" (UID: "af5c02f0-23a0-45e5-80ae-3510d6d908dc"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.710206 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-client-ca" (OuterVolumeSpecName: "client-ca") pod "10c98701-5de0-4c9b-a109-01a2985dc868" (UID: "10c98701-5de0-4c9b-a109-01a2985dc868"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.710318 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-config" (OuterVolumeSpecName: "config") pod "af5c02f0-23a0-45e5-80ae-3510d6d908dc" (UID: "af5c02f0-23a0-45e5-80ae-3510d6d908dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.710529 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-config" (OuterVolumeSpecName: "config") pod "10c98701-5de0-4c9b-a109-01a2985dc868" (UID: "10c98701-5de0-4c9b-a109-01a2985dc868"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.710590 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-client-ca" (OuterVolumeSpecName: "client-ca") pod "af5c02f0-23a0-45e5-80ae-3510d6d908dc" (UID: "af5c02f0-23a0-45e5-80ae-3510d6d908dc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.718318 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af5c02f0-23a0-45e5-80ae-3510d6d908dc-kube-api-access-bz285" (OuterVolumeSpecName: "kube-api-access-bz285") pod "af5c02f0-23a0-45e5-80ae-3510d6d908dc" (UID: "af5c02f0-23a0-45e5-80ae-3510d6d908dc"). InnerVolumeSpecName "kube-api-access-bz285". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.718517 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5c02f0-23a0-45e5-80ae-3510d6d908dc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "af5c02f0-23a0-45e5-80ae-3510d6d908dc" (UID: "af5c02f0-23a0-45e5-80ae-3510d6d908dc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.719059 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c98701-5de0-4c9b-a109-01a2985dc868-kube-api-access-ljw4w" (OuterVolumeSpecName: "kube-api-access-ljw4w") pod "10c98701-5de0-4c9b-a109-01a2985dc868" (UID: "10c98701-5de0-4c9b-a109-01a2985dc868"). InnerVolumeSpecName "kube-api-access-ljw4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.727452 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c98701-5de0-4c9b-a109-01a2985dc868-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "10c98701-5de0-4c9b-a109-01a2985dc868" (UID: "10c98701-5de0-4c9b-a109-01a2985dc868"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.741640 4807 generic.go:334] "Generic (PLEG): container finished" podID="10c98701-5de0-4c9b-a109-01a2985dc868" containerID="491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4" exitCode=0 Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.741707 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.741741 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" event={"ID":"10c98701-5de0-4c9b-a109-01a2985dc868","Type":"ContainerDied","Data":"491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4"} Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.741815 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb" event={"ID":"10c98701-5de0-4c9b-a109-01a2985dc868","Type":"ContainerDied","Data":"8bff21502f379b82383a01094aab4ef1552bbc6054025805aa10bbfe2900a0aa"} Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.741836 4807 scope.go:117] "RemoveContainer" containerID="491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.744445 4807 generic.go:334] "Generic (PLEG): container finished" podID="af5c02f0-23a0-45e5-80ae-3510d6d908dc" containerID="94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472" exitCode=0 Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.744476 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" 
event={"ID":"af5c02f0-23a0-45e5-80ae-3510d6d908dc","Type":"ContainerDied","Data":"94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472"} Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.744493 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" event={"ID":"af5c02f0-23a0-45e5-80ae-3510d6d908dc","Type":"ContainerDied","Data":"0479ccaa47fd86eae3a777649432bb87da764ddd54cc20950bf3cf98bd540e34"} Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.744551 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-kdp2m" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.757176 4807 scope.go:117] "RemoveContainer" containerID="491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4" Dec 02 20:02:59 crc kubenswrapper[4807]: E1202 20:02:59.757659 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4\": container with ID starting with 491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4 not found: ID does not exist" containerID="491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.757756 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4"} err="failed to get container status \"491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4\": rpc error: code = NotFound desc = could not find container \"491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4\": container with ID starting with 491ce146d958b78a2d4b9e7f0f29d2e12f0651ac144c38915fb676b3b2e55af4 not found: ID does not exist" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 
20:02:59.757778 4807 scope.go:117] "RemoveContainer" containerID="94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.773280 4807 scope.go:117] "RemoveContainer" containerID="94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472" Dec 02 20:02:59 crc kubenswrapper[4807]: E1202 20:02:59.774238 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472\": container with ID starting with 94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472 not found: ID does not exist" containerID="94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.774327 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472"} err="failed to get container status \"94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472\": rpc error: code = NotFound desc = could not find container \"94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472\": container with ID starting with 94b4bd9bd141f1e493e198941d44f0cae91e20feb5b8883f09aa143a34f25472 not found: ID does not exist" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.778315 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kdp2m"] Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.781344 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-kdp2m"] Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.792432 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb"] Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 
20:02:59.798343 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rzpzb"] Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.811512 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af5c02f0-23a0-45e5-80ae-3510d6d908dc-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.811550 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz285\" (UniqueName: \"kubernetes.io/projected/af5c02f0-23a0-45e5-80ae-3510d6d908dc-kube-api-access-bz285\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.811563 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.811573 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10c98701-5de0-4c9b-a109-01a2985dc868-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.811582 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.811591 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.811599 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljw4w\" (UniqueName: \"kubernetes.io/projected/10c98701-5de0-4c9b-a109-01a2985dc868-kube-api-access-ljw4w\") on node 
\"crc\" DevicePath \"\"" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.811607 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af5c02f0-23a0-45e5-80ae-3510d6d908dc-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.811615 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10c98701-5de0-4c9b-a109-01a2985dc868-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.947929 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d7777dc4b-9cbsv"] Dec 02 20:02:59 crc kubenswrapper[4807]: E1202 20:02:59.948459 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c98701-5de0-4c9b-a109-01a2985dc868" containerName="route-controller-manager" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.948475 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c98701-5de0-4c9b-a109-01a2985dc868" containerName="route-controller-manager" Dec 02 20:02:59 crc kubenswrapper[4807]: E1202 20:02:59.948491 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.948498 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 20:02:59 crc kubenswrapper[4807]: E1202 20:02:59.948510 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5c02f0-23a0-45e5-80ae-3510d6d908dc" containerName="controller-manager" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.948518 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5c02f0-23a0-45e5-80ae-3510d6d908dc" containerName="controller-manager" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.948621 4807 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.948642 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c98701-5de0-4c9b-a109-01a2985dc868" containerName="route-controller-manager" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.948657 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="af5c02f0-23a0-45e5-80ae-3510d6d908dc" containerName="controller-manager" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.949356 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.951744 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.951906 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.952133 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.952700 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.952900 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.953013 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.955589 4807 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp"] Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.956514 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.966897 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d7777dc4b-9cbsv"] Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.967117 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.967624 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.968106 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.968834 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.968867 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.971579 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.988112 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp"] Dec 02 20:02:59 crc kubenswrapper[4807]: I1202 20:02:59.988128 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.114184 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-serving-cert\") pod \"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.114254 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-config\") pod \"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.114309 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-proxy-ca-bundles\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.114338 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20e3186d-718c-4b4b-8a41-a6e10ae0439a-serving-cert\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.114366 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-46wjj\" (UniqueName: \"kubernetes.io/projected/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-kube-api-access-46wjj\") pod \"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.114398 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fgtq\" (UniqueName: \"kubernetes.io/projected/20e3186d-718c-4b4b-8a41-a6e10ae0439a-kube-api-access-4fgtq\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.114472 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-client-ca\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.114502 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-client-ca\") pod \"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.114525 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-config\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " 
pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.215599 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-client-ca\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.215672 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-client-ca\") pod \"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.215697 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-config\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.215743 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-serving-cert\") pod \"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.215771 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-config\") pod 
\"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.216875 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-client-ca\") pod \"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.217072 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-config\") pod \"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.215793 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-proxy-ca-bundles\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.217152 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20e3186d-718c-4b4b-8a41-a6e10ae0439a-serving-cert\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.217177 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-46wjj\" (UniqueName: \"kubernetes.io/projected/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-kube-api-access-46wjj\") pod \"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.217259 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-proxy-ca-bundles\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.217470 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-config\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.217803 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fgtq\" (UniqueName: \"kubernetes.io/projected/20e3186d-718c-4b4b-8a41-a6e10ae0439a-kube-api-access-4fgtq\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.218085 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-client-ca\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: 
I1202 20:03:00.222422 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20e3186d-718c-4b4b-8a41-a6e10ae0439a-serving-cert\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.222418 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-serving-cert\") pod \"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.235819 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46wjj\" (UniqueName: \"kubernetes.io/projected/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-kube-api-access-46wjj\") pod \"route-controller-manager-6c47b8f47d-zwhwp\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.238557 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fgtq\" (UniqueName: \"kubernetes.io/projected/20e3186d-718c-4b4b-8a41-a6e10ae0439a-kube-api-access-4fgtq\") pod \"controller-manager-d7777dc4b-9cbsv\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.301624 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.305257 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.533439 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d7777dc4b-9cbsv"] Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.585748 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp"] Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.752737 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" event={"ID":"20e3186d-718c-4b4b-8a41-a6e10ae0439a","Type":"ContainerStarted","Data":"532f6c84f13e12814be0095c800835d3357408409ba2377e6f52a696ba3061c8"} Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.753565 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" event={"ID":"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac","Type":"ContainerStarted","Data":"18e55c60141bed8de63ac5b136876ae79ad8a785e9718b47c7748ae8c2622d73"} Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.978828 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c98701-5de0-4c9b-a109-01a2985dc868" path="/var/lib/kubelet/pods/10c98701-5de0-4c9b-a109-01a2985dc868/volumes" Dec 02 20:03:00 crc kubenswrapper[4807]: I1202 20:03:00.979622 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af5c02f0-23a0-45e5-80ae-3510d6d908dc" path="/var/lib/kubelet/pods/af5c02f0-23a0-45e5-80ae-3510d6d908dc/volumes" Dec 02 20:03:01 crc kubenswrapper[4807]: I1202 20:03:01.761568 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" 
event={"ID":"20e3186d-718c-4b4b-8a41-a6e10ae0439a","Type":"ContainerStarted","Data":"ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a"} Dec 02 20:03:01 crc kubenswrapper[4807]: I1202 20:03:01.764205 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:01 crc kubenswrapper[4807]: I1202 20:03:01.765814 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" event={"ID":"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac","Type":"ContainerStarted","Data":"5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237"} Dec 02 20:03:01 crc kubenswrapper[4807]: I1202 20:03:01.766831 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:01 crc kubenswrapper[4807]: I1202 20:03:01.769680 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:01 crc kubenswrapper[4807]: I1202 20:03:01.778044 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" podStartSLOduration=3.778025651 podStartE2EDuration="3.778025651s" podCreationTimestamp="2025-12-02 20:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:03:01.776409308 +0000 UTC m=+317.077316813" watchObservedRunningTime="2025-12-02 20:03:01.778025651 +0000 UTC m=+317.078933146" Dec 02 20:03:01 crc kubenswrapper[4807]: I1202 20:03:01.779261 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:01 crc kubenswrapper[4807]: I1202 
20:03:01.799738 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" podStartSLOduration=3.799711091 podStartE2EDuration="3.799711091s" podCreationTimestamp="2025-12-02 20:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:03:01.797953515 +0000 UTC m=+317.098861010" watchObservedRunningTime="2025-12-02 20:03:01.799711091 +0000 UTC m=+317.100618586" Dec 02 20:03:02 crc kubenswrapper[4807]: I1202 20:03:02.112964 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d7777dc4b-9cbsv"] Dec 02 20:03:02 crc kubenswrapper[4807]: I1202 20:03:02.123454 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp"] Dec 02 20:03:03 crc kubenswrapper[4807]: I1202 20:03:03.777614 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" podUID="20e3186d-718c-4b4b-8a41-a6e10ae0439a" containerName="controller-manager" containerID="cri-o://ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a" gracePeriod=30 Dec 02 20:03:03 crc kubenswrapper[4807]: I1202 20:03:03.777917 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" podUID="5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac" containerName="route-controller-manager" containerID="cri-o://5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237" gracePeriod=30 Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.147857 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.196502 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j"] Dec 02 20:03:04 crc kubenswrapper[4807]: E1202 20:03:04.196766 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac" containerName="route-controller-manager" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.196780 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac" containerName="route-controller-manager" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.196904 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac" containerName="route-controller-manager" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.197303 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.225380 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j"] Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.240293 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.276809 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-serving-cert\") pod \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.276864 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46wjj\" (UniqueName: \"kubernetes.io/projected/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-kube-api-access-46wjj\") pod \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.276927 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-client-ca\") pod \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.276957 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-config\") pod \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\" (UID: \"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac\") " Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.278150 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-config" (OuterVolumeSpecName: "config") pod "5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac" (UID: "5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.292178 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-client-ca" (OuterVolumeSpecName: "client-ca") pod "5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac" (UID: "5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.292830 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-kube-api-access-46wjj" (OuterVolumeSpecName: "kube-api-access-46wjj") pod "5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac" (UID: "5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac"). InnerVolumeSpecName "kube-api-access-46wjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.293925 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac" (UID: "5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.378186 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20e3186d-718c-4b4b-8a41-a6e10ae0439a-serving-cert\") pod \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.378462 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-client-ca\") pod \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.378546 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fgtq\" (UniqueName: \"kubernetes.io/projected/20e3186d-718c-4b4b-8a41-a6e10ae0439a-kube-api-access-4fgtq\") pod \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.378590 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-config\") pod \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.378665 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-proxy-ca-bundles\") pod \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\" (UID: \"20e3186d-718c-4b4b-8a41-a6e10ae0439a\") " Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.378982 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-client-ca" (OuterVolumeSpecName: "client-ca") pod "20e3186d-718c-4b4b-8a41-a6e10ae0439a" (UID: "20e3186d-718c-4b4b-8a41-a6e10ae0439a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.379006 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12dd037f-62ba-4373-b852-5b8ad447a6dd-serving-cert\") pod \"route-controller-manager-6558d68c-pm64j\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.379162 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkqn8\" (UniqueName: \"kubernetes.io/projected/12dd037f-62ba-4373-b852-5b8ad447a6dd-kube-api-access-kkqn8\") pod \"route-controller-manager-6558d68c-pm64j\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.379234 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-config\") pod \"route-controller-manager-6558d68c-pm64j\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.379267 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-client-ca\") pod \"route-controller-manager-6558d68c-pm64j\" (UID: 
\"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.379392 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.379417 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46wjj\" (UniqueName: \"kubernetes.io/projected/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-kube-api-access-46wjj\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.379433 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.379469 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.379481 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.379275 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "20e3186d-718c-4b4b-8a41-a6e10ae0439a" (UID: "20e3186d-718c-4b4b-8a41-a6e10ae0439a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.380161 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-config" (OuterVolumeSpecName: "config") pod "20e3186d-718c-4b4b-8a41-a6e10ae0439a" (UID: "20e3186d-718c-4b4b-8a41-a6e10ae0439a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.381759 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20e3186d-718c-4b4b-8a41-a6e10ae0439a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "20e3186d-718c-4b4b-8a41-a6e10ae0439a" (UID: "20e3186d-718c-4b4b-8a41-a6e10ae0439a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.382143 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e3186d-718c-4b4b-8a41-a6e10ae0439a-kube-api-access-4fgtq" (OuterVolumeSpecName: "kube-api-access-4fgtq") pod "20e3186d-718c-4b4b-8a41-a6e10ae0439a" (UID: "20e3186d-718c-4b4b-8a41-a6e10ae0439a"). InnerVolumeSpecName "kube-api-access-4fgtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.480200 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12dd037f-62ba-4373-b852-5b8ad447a6dd-serving-cert\") pod \"route-controller-manager-6558d68c-pm64j\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.480272 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkqn8\" (UniqueName: \"kubernetes.io/projected/12dd037f-62ba-4373-b852-5b8ad447a6dd-kube-api-access-kkqn8\") pod \"route-controller-manager-6558d68c-pm64j\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.480312 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-config\") pod \"route-controller-manager-6558d68c-pm64j\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.480330 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-client-ca\") pod \"route-controller-manager-6558d68c-pm64j\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.480368 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fgtq\" (UniqueName: 
\"kubernetes.io/projected/20e3186d-718c-4b4b-8a41-a6e10ae0439a-kube-api-access-4fgtq\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.480378 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.480387 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20e3186d-718c-4b4b-8a41-a6e10ae0439a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.480395 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20e3186d-718c-4b4b-8a41-a6e10ae0439a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.481242 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-client-ca\") pod \"route-controller-manager-6558d68c-pm64j\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.482027 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-config\") pod \"route-controller-manager-6558d68c-pm64j\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.487169 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12dd037f-62ba-4373-b852-5b8ad447a6dd-serving-cert\") pod 
\"route-controller-manager-6558d68c-pm64j\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.498751 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkqn8\" (UniqueName: \"kubernetes.io/projected/12dd037f-62ba-4373-b852-5b8ad447a6dd-kube-api-access-kkqn8\") pod \"route-controller-manager-6558d68c-pm64j\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.537154 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.781910 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j"] Dec 02 20:03:04 crc kubenswrapper[4807]: W1202 20:03:04.786125 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12dd037f_62ba_4373_b852_5b8ad447a6dd.slice/crio-477b5e9be803bd7c5b2353f614fbcdaf6e17340ac40ca8a74936a23d2d28dcea WatchSource:0}: Error finding container 477b5e9be803bd7c5b2353f614fbcdaf6e17340ac40ca8a74936a23d2d28dcea: Status 404 returned error can't find the container with id 477b5e9be803bd7c5b2353f614fbcdaf6e17340ac40ca8a74936a23d2d28dcea Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.788457 4807 generic.go:334] "Generic (PLEG): container finished" podID="5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac" containerID="5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237" exitCode=0 Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.788525 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.788616 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" event={"ID":"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac","Type":"ContainerDied","Data":"5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237"} Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.788824 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp" event={"ID":"5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac","Type":"ContainerDied","Data":"18e55c60141bed8de63ac5b136876ae79ad8a785e9718b47c7748ae8c2622d73"} Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.788853 4807 scope.go:117] "RemoveContainer" containerID="5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.793012 4807 generic.go:334] "Generic (PLEG): container finished" podID="20e3186d-718c-4b4b-8a41-a6e10ae0439a" containerID="ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a" exitCode=0 Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.793099 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.793095 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" event={"ID":"20e3186d-718c-4b4b-8a41-a6e10ae0439a","Type":"ContainerDied","Data":"ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a"} Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.793269 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d7777dc4b-9cbsv" event={"ID":"20e3186d-718c-4b4b-8a41-a6e10ae0439a","Type":"ContainerDied","Data":"532f6c84f13e12814be0095c800835d3357408409ba2377e6f52a696ba3061c8"} Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.810673 4807 scope.go:117] "RemoveContainer" containerID="5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237" Dec 02 20:03:04 crc kubenswrapper[4807]: E1202 20:03:04.811150 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237\": container with ID starting with 5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237 not found: ID does not exist" containerID="5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.811286 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237"} err="failed to get container status \"5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237\": rpc error: code = NotFound desc = could not find container \"5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237\": container with ID starting with 5e6e2ebbcd159fc96a36435d00ae336773403c8e4481d733d43ded15a99fe237 not found: ID does not 
exist" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.811355 4807 scope.go:117] "RemoveContainer" containerID="ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.828618 4807 scope.go:117] "RemoveContainer" containerID="ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a" Dec 02 20:03:04 crc kubenswrapper[4807]: E1202 20:03:04.829156 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a\": container with ID starting with ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a not found: ID does not exist" containerID="ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.829201 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a"} err="failed to get container status \"ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a\": rpc error: code = NotFound desc = could not find container \"ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a\": container with ID starting with ea035b9da12066a2136f9cf9f28b008b27707a0424badac2ff575b55697b602a not found: ID does not exist" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.833522 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d7777dc4b-9cbsv"] Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.837953 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d7777dc4b-9cbsv"] Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.850368 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp"] Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.862074 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c47b8f47d-zwhwp"] Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.983295 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20e3186d-718c-4b4b-8a41-a6e10ae0439a" path="/var/lib/kubelet/pods/20e3186d-718c-4b4b-8a41-a6e10ae0439a/volumes" Dec 02 20:03:04 crc kubenswrapper[4807]: I1202 20:03:04.983924 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac" path="/var/lib/kubelet/pods/5cfd73cf-defc-4b40-98a2-8dcae7b8c0ac/volumes" Dec 02 20:03:05 crc kubenswrapper[4807]: I1202 20:03:05.808796 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" event={"ID":"12dd037f-62ba-4373-b852-5b8ad447a6dd","Type":"ContainerStarted","Data":"4768f1e952f304deecd2e013cbb1e84a548ea613e5312ea928f38f59f995f24f"} Dec 02 20:03:05 crc kubenswrapper[4807]: I1202 20:03:05.810498 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" event={"ID":"12dd037f-62ba-4373-b852-5b8ad447a6dd","Type":"ContainerStarted","Data":"477b5e9be803bd7c5b2353f614fbcdaf6e17340ac40ca8a74936a23d2d28dcea"} Dec 02 20:03:05 crc kubenswrapper[4807]: I1202 20:03:05.810675 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:05 crc kubenswrapper[4807]: I1202 20:03:05.816543 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:05 crc kubenswrapper[4807]: I1202 20:03:05.839706 4807 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" podStartSLOduration=2.839678454 podStartE2EDuration="2.839678454s" podCreationTimestamp="2025-12-02 20:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:03:05.831234033 +0000 UTC m=+321.132141548" watchObservedRunningTime="2025-12-02 20:03:05.839678454 +0000 UTC m=+321.140585949" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.955205 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn"] Dec 02 20:03:06 crc kubenswrapper[4807]: E1202 20:03:06.955862 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e3186d-718c-4b4b-8a41-a6e10ae0439a" containerName="controller-manager" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.955879 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e3186d-718c-4b4b-8a41-a6e10ae0439a" containerName="controller-manager" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.955983 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e3186d-718c-4b4b-8a41-a6e10ae0439a" containerName="controller-manager" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.956379 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.958575 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.960158 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.960325 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.961495 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.961952 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.962051 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.969003 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 20:03:06 crc kubenswrapper[4807]: I1202 20:03:06.979225 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn"] Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.018457 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zxmh\" (UniqueName: \"kubernetes.io/projected/a3ea161c-4312-4cba-ac64-07ab0399d4b7-kube-api-access-6zxmh\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " 
pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.018538 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ea161c-4312-4cba-ac64-07ab0399d4b7-serving-cert\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.018604 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-proxy-ca-bundles\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.018651 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-config\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.018704 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-client-ca\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.120179 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-config\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.120863 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-client-ca\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.121154 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zxmh\" (UniqueName: \"kubernetes.io/projected/a3ea161c-4312-4cba-ac64-07ab0399d4b7-kube-api-access-6zxmh\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.121339 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ea161c-4312-4cba-ac64-07ab0399d4b7-serving-cert\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.121680 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-proxy-ca-bundles\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.122130 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-config\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.123485 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-client-ca\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.124181 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-proxy-ca-bundles\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.129451 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ea161c-4312-4cba-ac64-07ab0399d4b7-serving-cert\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.146465 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zxmh\" (UniqueName: \"kubernetes.io/projected/a3ea161c-4312-4cba-ac64-07ab0399d4b7-kube-api-access-6zxmh\") pod \"controller-manager-7bcb6f9d47-29fpn\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 
20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.274370 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.476606 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn"] Dec 02 20:03:07 crc kubenswrapper[4807]: I1202 20:03:07.824563 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" event={"ID":"a3ea161c-4312-4cba-ac64-07ab0399d4b7","Type":"ContainerStarted","Data":"114f2c19312319cba97976a71cb6a5cf0097ad25dbb13a31b6e04d43428c2402"} Dec 02 20:03:09 crc kubenswrapper[4807]: I1202 20:03:09.838480 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" event={"ID":"a3ea161c-4312-4cba-ac64-07ab0399d4b7","Type":"ContainerStarted","Data":"4e92dcc6eda35771bc8949932ba7766175315c8fe681bb529dda9123c41c5ec8"} Dec 02 20:03:09 crc kubenswrapper[4807]: I1202 20:03:09.839221 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:09 crc kubenswrapper[4807]: I1202 20:03:09.847388 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:09 crc kubenswrapper[4807]: I1202 20:03:09.859879 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" podStartSLOduration=6.859861168 podStartE2EDuration="6.859861168s" podCreationTimestamp="2025-12-02 20:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:03:09.859848847 +0000 UTC m=+325.160756342" 
watchObservedRunningTime="2025-12-02 20:03:09.859861168 +0000 UTC m=+325.160768663" Dec 02 20:03:18 crc kubenswrapper[4807]: I1202 20:03:18.708265 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn"] Dec 02 20:03:18 crc kubenswrapper[4807]: I1202 20:03:18.709418 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" podUID="a3ea161c-4312-4cba-ac64-07ab0399d4b7" containerName="controller-manager" containerID="cri-o://4e92dcc6eda35771bc8949932ba7766175315c8fe681bb529dda9123c41c5ec8" gracePeriod=30 Dec 02 20:03:18 crc kubenswrapper[4807]: I1202 20:03:18.719924 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j"] Dec 02 20:03:18 crc kubenswrapper[4807]: I1202 20:03:18.720285 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" podUID="12dd037f-62ba-4373-b852-5b8ad447a6dd" containerName="route-controller-manager" containerID="cri-o://4768f1e952f304deecd2e013cbb1e84a548ea613e5312ea928f38f59f995f24f" gracePeriod=30 Dec 02 20:03:18 crc kubenswrapper[4807]: E1202 20:03:18.837160 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3ea161c_4312_4cba_ac64_07ab0399d4b7.slice/crio-4e92dcc6eda35771bc8949932ba7766175315c8fe681bb529dda9123c41c5ec8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12dd037f_62ba_4373_b852_5b8ad447a6dd.slice/crio-4768f1e952f304deecd2e013cbb1e84a548ea613e5312ea928f38f59f995f24f.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:03:19 crc kubenswrapper[4807]: I1202 20:03:19.890908 4807 
generic.go:334] "Generic (PLEG): container finished" podID="12dd037f-62ba-4373-b852-5b8ad447a6dd" containerID="4768f1e952f304deecd2e013cbb1e84a548ea613e5312ea928f38f59f995f24f" exitCode=0 Dec 02 20:03:19 crc kubenswrapper[4807]: I1202 20:03:19.891013 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" event={"ID":"12dd037f-62ba-4373-b852-5b8ad447a6dd","Type":"ContainerDied","Data":"4768f1e952f304deecd2e013cbb1e84a548ea613e5312ea928f38f59f995f24f"} Dec 02 20:03:19 crc kubenswrapper[4807]: I1202 20:03:19.892934 4807 generic.go:334] "Generic (PLEG): container finished" podID="a3ea161c-4312-4cba-ac64-07ab0399d4b7" containerID="4e92dcc6eda35771bc8949932ba7766175315c8fe681bb529dda9123c41c5ec8" exitCode=0 Dec 02 20:03:19 crc kubenswrapper[4807]: I1202 20:03:19.892992 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" event={"ID":"a3ea161c-4312-4cba-ac64-07ab0399d4b7","Type":"ContainerDied","Data":"4e92dcc6eda35771bc8949932ba7766175315c8fe681bb529dda9123c41c5ec8"} Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.455817 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.492086 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht"] Dec 02 20:03:20 crc kubenswrapper[4807]: E1202 20:03:20.492415 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12dd037f-62ba-4373-b852-5b8ad447a6dd" containerName="route-controller-manager" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.492441 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="12dd037f-62ba-4373-b852-5b8ad447a6dd" containerName="route-controller-manager" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.492609 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="12dd037f-62ba-4373-b852-5b8ad447a6dd" containerName="route-controller-manager" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.493241 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.503744 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.510484 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht"] Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.524394 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-config\") pod \"12dd037f-62ba-4373-b852-5b8ad447a6dd\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.524490 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-client-ca\") pod \"12dd037f-62ba-4373-b852-5b8ad447a6dd\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.524548 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12dd037f-62ba-4373-b852-5b8ad447a6dd-serving-cert\") pod \"12dd037f-62ba-4373-b852-5b8ad447a6dd\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.524600 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkqn8\" (UniqueName: \"kubernetes.io/projected/12dd037f-62ba-4373-b852-5b8ad447a6dd-kube-api-access-kkqn8\") pod \"12dd037f-62ba-4373-b852-5b8ad447a6dd\" (UID: \"12dd037f-62ba-4373-b852-5b8ad447a6dd\") " Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.525420 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-client-ca" (OuterVolumeSpecName: "client-ca") pod "12dd037f-62ba-4373-b852-5b8ad447a6dd" (UID: 
"12dd037f-62ba-4373-b852-5b8ad447a6dd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.525513 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-config" (OuterVolumeSpecName: "config") pod "12dd037f-62ba-4373-b852-5b8ad447a6dd" (UID: "12dd037f-62ba-4373-b852-5b8ad447a6dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.535505 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12dd037f-62ba-4373-b852-5b8ad447a6dd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "12dd037f-62ba-4373-b852-5b8ad447a6dd" (UID: "12dd037f-62ba-4373-b852-5b8ad447a6dd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.536096 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12dd037f-62ba-4373-b852-5b8ad447a6dd-kube-api-access-kkqn8" (OuterVolumeSpecName: "kube-api-access-kkqn8") pod "12dd037f-62ba-4373-b852-5b8ad447a6dd" (UID: "12dd037f-62ba-4373-b852-5b8ad447a6dd"). InnerVolumeSpecName "kube-api-access-kkqn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.626314 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-config\") pod \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.626401 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-proxy-ca-bundles\") pod \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.626444 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ea161c-4312-4cba-ac64-07ab0399d4b7-serving-cert\") pod \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.626498 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zxmh\" (UniqueName: \"kubernetes.io/projected/a3ea161c-4312-4cba-ac64-07ab0399d4b7-kube-api-access-6zxmh\") pod \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.626545 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-client-ca\") pod \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\" (UID: \"a3ea161c-4312-4cba-ac64-07ab0399d4b7\") " Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.626908 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x8rb\" 
(UniqueName: \"kubernetes.io/projected/3439f371-6c61-403d-b629-b7314af13736-kube-api-access-4x8rb\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.626978 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3439f371-6c61-403d-b629-b7314af13736-serving-cert\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.627007 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3439f371-6c61-403d-b629-b7314af13736-config\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.627335 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3439f371-6c61-403d-b629-b7314af13736-client-ca\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.627398 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a3ea161c-4312-4cba-ac64-07ab0399d4b7" (UID: "a3ea161c-4312-4cba-ac64-07ab0399d4b7"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.627537 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.627565 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12dd037f-62ba-4373-b852-5b8ad447a6dd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.627580 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkqn8\" (UniqueName: \"kubernetes.io/projected/12dd037f-62ba-4373-b852-5b8ad447a6dd-kube-api-access-kkqn8\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.627599 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12dd037f-62ba-4373-b852-5b8ad447a6dd-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.627524 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-client-ca" (OuterVolumeSpecName: "client-ca") pod "a3ea161c-4312-4cba-ac64-07ab0399d4b7" (UID: "a3ea161c-4312-4cba-ac64-07ab0399d4b7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.628152 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-config" (OuterVolumeSpecName: "config") pod "a3ea161c-4312-4cba-ac64-07ab0399d4b7" (UID: "a3ea161c-4312-4cba-ac64-07ab0399d4b7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.630841 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3ea161c-4312-4cba-ac64-07ab0399d4b7-kube-api-access-6zxmh" (OuterVolumeSpecName: "kube-api-access-6zxmh") pod "a3ea161c-4312-4cba-ac64-07ab0399d4b7" (UID: "a3ea161c-4312-4cba-ac64-07ab0399d4b7"). InnerVolumeSpecName "kube-api-access-6zxmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.630882 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ea161c-4312-4cba-ac64-07ab0399d4b7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a3ea161c-4312-4cba-ac64-07ab0399d4b7" (UID: "a3ea161c-4312-4cba-ac64-07ab0399d4b7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.729228 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3439f371-6c61-403d-b629-b7314af13736-client-ca\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.729297 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x8rb\" (UniqueName: \"kubernetes.io/projected/3439f371-6c61-403d-b629-b7314af13736-kube-api-access-4x8rb\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.729348 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3439f371-6c61-403d-b629-b7314af13736-serving-cert\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.729368 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3439f371-6c61-403d-b629-b7314af13736-config\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.729418 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.729435 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.729447 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3ea161c-4312-4cba-ac64-07ab0399d4b7-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.729458 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zxmh\" (UniqueName: \"kubernetes.io/projected/a3ea161c-4312-4cba-ac64-07ab0399d4b7-kube-api-access-6zxmh\") on node \"crc\" DevicePath \"\"" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.729468 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3ea161c-4312-4cba-ac64-07ab0399d4b7-client-ca\") on node 
\"crc\" DevicePath \"\"" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.730340 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3439f371-6c61-403d-b629-b7314af13736-client-ca\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.730467 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3439f371-6c61-403d-b629-b7314af13736-config\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.732964 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3439f371-6c61-403d-b629-b7314af13736-serving-cert\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.760167 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x8rb\" (UniqueName: \"kubernetes.io/projected/3439f371-6c61-403d-b629-b7314af13736-kube-api-access-4x8rb\") pod \"route-controller-manager-57467f5656-pjzht\" (UID: \"3439f371-6c61-403d-b629-b7314af13736\") " pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.815599 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.914293 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.914291 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j" event={"ID":"12dd037f-62ba-4373-b852-5b8ad447a6dd","Type":"ContainerDied","Data":"477b5e9be803bd7c5b2353f614fbcdaf6e17340ac40ca8a74936a23d2d28dcea"} Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.914359 4807 scope.go:117] "RemoveContainer" containerID="4768f1e952f304deecd2e013cbb1e84a548ea613e5312ea928f38f59f995f24f" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.918246 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" event={"ID":"a3ea161c-4312-4cba-ac64-07ab0399d4b7","Type":"ContainerDied","Data":"114f2c19312319cba97976a71cb6a5cf0097ad25dbb13a31b6e04d43428c2402"} Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.918341 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn" Dec 02 20:03:20 crc kubenswrapper[4807]: I1202 20:03:20.970105 4807 scope.go:117] "RemoveContainer" containerID="4e92dcc6eda35771bc8949932ba7766175315c8fe681bb529dda9123c41c5ec8" Dec 02 20:03:21 crc kubenswrapper[4807]: I1202 20:03:21.017682 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn"] Dec 02 20:03:21 crc kubenswrapper[4807]: I1202 20:03:21.022781 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7bcb6f9d47-29fpn"] Dec 02 20:03:21 crc kubenswrapper[4807]: I1202 20:03:21.033752 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j"] Dec 02 20:03:21 crc kubenswrapper[4807]: I1202 20:03:21.041803 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6558d68c-pm64j"] Dec 02 20:03:21 crc kubenswrapper[4807]: I1202 20:03:21.080564 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht"] Dec 02 20:03:21 crc kubenswrapper[4807]: I1202 20:03:21.930026 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" event={"ID":"3439f371-6c61-403d-b629-b7314af13736","Type":"ContainerStarted","Data":"6ee5452cee38076d17c164f7e9d2646a6364de5f7f2d3e4e0967eb7096d26dc5"} Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.937639 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" event={"ID":"3439f371-6c61-403d-b629-b7314af13736","Type":"ContainerStarted","Data":"ef3035318380255d36e0cbcdcfa2ccaed4c37b7829f080d44076d32f4f6bac28"} Dec 02 20:03:22 crc 
kubenswrapper[4807]: I1202 20:03:22.938104 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.946107 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.953537 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57467f5656-pjzht" podStartSLOduration=4.95351692 podStartE2EDuration="4.95351692s" podCreationTimestamp="2025-12-02 20:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:03:22.951893414 +0000 UTC m=+338.252800909" watchObservedRunningTime="2025-12-02 20:03:22.95351692 +0000 UTC m=+338.254424415" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.967960 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-784c9756bd-dtf9k"] Dec 02 20:03:22 crc kubenswrapper[4807]: E1202 20:03:22.968282 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3ea161c-4312-4cba-ac64-07ab0399d4b7" containerName="controller-manager" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.968306 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ea161c-4312-4cba-ac64-07ab0399d4b7" containerName="controller-manager" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.968443 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3ea161c-4312-4cba-ac64-07ab0399d4b7" containerName="controller-manager" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.969006 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.973399 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.973412 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.973449 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.973744 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.973862 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.973985 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.981836 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.988977 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12dd037f-62ba-4373-b852-5b8ad447a6dd" path="/var/lib/kubelet/pods/12dd037f-62ba-4373-b852-5b8ad447a6dd/volumes" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.990158 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3ea161c-4312-4cba-ac64-07ab0399d4b7" path="/var/lib/kubelet/pods/a3ea161c-4312-4cba-ac64-07ab0399d4b7/volumes" Dec 02 20:03:22 crc kubenswrapper[4807]: I1202 20:03:22.993151 4807 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-controller-manager/controller-manager-784c9756bd-dtf9k"] Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.067866 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4777d78f-48e2-4dae-a030-9b89449686e1-config\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.067950 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4777d78f-48e2-4dae-a030-9b89449686e1-proxy-ca-bundles\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.068050 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-559w4\" (UniqueName: \"kubernetes.io/projected/4777d78f-48e2-4dae-a030-9b89449686e1-kube-api-access-559w4\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.068200 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4777d78f-48e2-4dae-a030-9b89449686e1-client-ca\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.068259 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4777d78f-48e2-4dae-a030-9b89449686e1-serving-cert\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.169581 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4777d78f-48e2-4dae-a030-9b89449686e1-config\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.169911 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4777d78f-48e2-4dae-a030-9b89449686e1-proxy-ca-bundles\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.170035 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-559w4\" (UniqueName: \"kubernetes.io/projected/4777d78f-48e2-4dae-a030-9b89449686e1-kube-api-access-559w4\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.170169 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4777d78f-48e2-4dae-a030-9b89449686e1-client-ca\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: 
I1202 20:03:23.170279 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4777d78f-48e2-4dae-a030-9b89449686e1-serving-cert\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.171349 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4777d78f-48e2-4dae-a030-9b89449686e1-proxy-ca-bundles\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.171479 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4777d78f-48e2-4dae-a030-9b89449686e1-config\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.172256 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4777d78f-48e2-4dae-a030-9b89449686e1-client-ca\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.176452 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4777d78f-48e2-4dae-a030-9b89449686e1-serving-cert\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " 
pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.187586 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-559w4\" (UniqueName: \"kubernetes.io/projected/4777d78f-48e2-4dae-a030-9b89449686e1-kube-api-access-559w4\") pod \"controller-manager-784c9756bd-dtf9k\" (UID: \"4777d78f-48e2-4dae-a030-9b89449686e1\") " pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.290555 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.704710 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-784c9756bd-dtf9k"] Dec 02 20:03:23 crc kubenswrapper[4807]: I1202 20:03:23.946805 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" event={"ID":"4777d78f-48e2-4dae-a030-9b89449686e1","Type":"ContainerStarted","Data":"1409a33a6659d56d874532c1be3df211a8ca7f1e66265efaa3ab91258c6e6510"} Dec 02 20:03:25 crc kubenswrapper[4807]: I1202 20:03:25.963524 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" event={"ID":"4777d78f-48e2-4dae-a030-9b89449686e1","Type":"ContainerStarted","Data":"7dc1bba96e4fcd774462b8011e418a76a1387d24e60c24b631e3706bd04e0f34"} Dec 02 20:03:26 crc kubenswrapper[4807]: I1202 20:03:26.971366 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 20:03:26 crc kubenswrapper[4807]: I1202 20:03:26.981577 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" Dec 02 
20:03:26 crc kubenswrapper[4807]: I1202 20:03:26.997654 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-784c9756bd-dtf9k" podStartSLOduration=8.997636773 podStartE2EDuration="8.997636773s" podCreationTimestamp="2025-12-02 20:03:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:03:26.991455828 +0000 UTC m=+342.292363343" watchObservedRunningTime="2025-12-02 20:03:26.997636773 +0000 UTC m=+342.298544258" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.404862 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4dl7k"] Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.406389 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.422541 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4dl7k"] Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.552957 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52da9985-c847-4051-a9c2-036fbdb7a2f6-registry-certificates\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.553028 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52da9985-c847-4051-a9c2-036fbdb7a2f6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.553056 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52da9985-c847-4051-a9c2-036fbdb7a2f6-registry-tls\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.553099 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52da9985-c847-4051-a9c2-036fbdb7a2f6-trusted-ca\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.553227 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf8zr\" (UniqueName: \"kubernetes.io/projected/52da9985-c847-4051-a9c2-036fbdb7a2f6-kube-api-access-rf8zr\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.553266 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.553296 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/52da9985-c847-4051-a9c2-036fbdb7a2f6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.553337 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52da9985-c847-4051-a9c2-036fbdb7a2f6-bound-sa-token\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.583074 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.654125 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52da9985-c847-4051-a9c2-036fbdb7a2f6-registry-certificates\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.654189 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52da9985-c847-4051-a9c2-036fbdb7a2f6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc 
kubenswrapper[4807]: I1202 20:03:45.654212 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52da9985-c847-4051-a9c2-036fbdb7a2f6-registry-tls\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.654238 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52da9985-c847-4051-a9c2-036fbdb7a2f6-trusted-ca\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.654277 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf8zr\" (UniqueName: \"kubernetes.io/projected/52da9985-c847-4051-a9c2-036fbdb7a2f6-kube-api-access-rf8zr\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.654317 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/52da9985-c847-4051-a9c2-036fbdb7a2f6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.654340 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52da9985-c847-4051-a9c2-036fbdb7a2f6-bound-sa-token\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.654894 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/52da9985-c847-4051-a9c2-036fbdb7a2f6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.655851 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/52da9985-c847-4051-a9c2-036fbdb7a2f6-registry-certificates\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.656545 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/52da9985-c847-4051-a9c2-036fbdb7a2f6-trusted-ca\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.663734 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/52da9985-c847-4051-a9c2-036fbdb7a2f6-registry-tls\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.669357 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/52da9985-c847-4051-a9c2-036fbdb7a2f6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: 
\"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.693388 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/52da9985-c847-4051-a9c2-036fbdb7a2f6-bound-sa-token\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.697518 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf8zr\" (UniqueName: \"kubernetes.io/projected/52da9985-c847-4051-a9c2-036fbdb7a2f6-kube-api-access-rf8zr\") pod \"image-registry-66df7c8f76-4dl7k\" (UID: \"52da9985-c847-4051-a9c2-036fbdb7a2f6\") " pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:45 crc kubenswrapper[4807]: I1202 20:03:45.725052 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:46 crc kubenswrapper[4807]: I1202 20:03:46.130911 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4dl7k"] Dec 02 20:03:47 crc kubenswrapper[4807]: I1202 20:03:47.098888 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" event={"ID":"52da9985-c847-4051-a9c2-036fbdb7a2f6","Type":"ContainerStarted","Data":"f3d607c0790700fac7a27d293a6153e680479925d9a72fc9edd3a5be3f3ae902"} Dec 02 20:03:49 crc kubenswrapper[4807]: I1202 20:03:49.114738 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" event={"ID":"52da9985-c847-4051-a9c2-036fbdb7a2f6","Type":"ContainerStarted","Data":"302d0cdd885eed9990931b87969ab219a6d95e43629e86577361c8fe714e34ea"} Dec 02 20:03:49 crc kubenswrapper[4807]: I1202 20:03:49.115509 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:03:49 crc kubenswrapper[4807]: I1202 20:03:49.145252 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" podStartSLOduration=4.14522934 podStartE2EDuration="4.14522934s" podCreationTimestamp="2025-12-02 20:03:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:03:49.141183365 +0000 UTC m=+364.442090870" watchObservedRunningTime="2025-12-02 20:03:49.14522934 +0000 UTC m=+364.446136845" Dec 02 20:03:58 crc kubenswrapper[4807]: I1202 20:03:58.293232 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:03:58 crc kubenswrapper[4807]: I1202 20:03:58.293900 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:04:05 crc kubenswrapper[4807]: I1202 20:04:05.731936 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-4dl7k" Dec 02 20:04:05 crc kubenswrapper[4807]: I1202 20:04:05.779931 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jh4tx"] Dec 02 20:04:28 crc kubenswrapper[4807]: I1202 20:04:28.293439 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:04:28 crc kubenswrapper[4807]: I1202 20:04:28.294128 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:04:30 crc kubenswrapper[4807]: I1202 20:04:30.824153 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" podUID="a8f3efef-a103-4813-94ed-1c9bd0113f84" containerName="registry" containerID="cri-o://bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c" gracePeriod=30 Dec 02 20:04:31 crc 
kubenswrapper[4807]: I1202 20:04:31.744031 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.850141 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-tls\") pod \"a8f3efef-a103-4813-94ed-1c9bd0113f84\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.850883 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8f3efef-a103-4813-94ed-1c9bd0113f84-installation-pull-secrets\") pod \"a8f3efef-a103-4813-94ed-1c9bd0113f84\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.851183 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7mbs\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-kube-api-access-k7mbs\") pod \"a8f3efef-a103-4813-94ed-1c9bd0113f84\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.851455 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a8f3efef-a103-4813-94ed-1c9bd0113f84\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.851499 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8f3efef-a103-4813-94ed-1c9bd0113f84-ca-trust-extracted\") pod \"a8f3efef-a103-4813-94ed-1c9bd0113f84\" (UID: 
\"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.851550 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-certificates\") pod \"a8f3efef-a103-4813-94ed-1c9bd0113f84\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.851602 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-bound-sa-token\") pod \"a8f3efef-a103-4813-94ed-1c9bd0113f84\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.851648 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-trusted-ca\") pod \"a8f3efef-a103-4813-94ed-1c9bd0113f84\" (UID: \"a8f3efef-a103-4813-94ed-1c9bd0113f84\") " Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.852434 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a8f3efef-a103-4813-94ed-1c9bd0113f84" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.852564 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a8f3efef-a103-4813-94ed-1c9bd0113f84" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.858246 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a8f3efef-a103-4813-94ed-1c9bd0113f84" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.858498 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a8f3efef-a103-4813-94ed-1c9bd0113f84" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.859672 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-kube-api-access-k7mbs" (OuterVolumeSpecName: "kube-api-access-k7mbs") pod "a8f3efef-a103-4813-94ed-1c9bd0113f84" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84"). InnerVolumeSpecName "kube-api-access-k7mbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.859806 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8f3efef-a103-4813-94ed-1c9bd0113f84-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a8f3efef-a103-4813-94ed-1c9bd0113f84" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.862463 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a8f3efef-a103-4813-94ed-1c9bd0113f84" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.871198 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8f3efef-a103-4813-94ed-1c9bd0113f84-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a8f3efef-a103-4813-94ed-1c9bd0113f84" (UID: "a8f3efef-a103-4813-94ed-1c9bd0113f84"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.952998 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7mbs\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-kube-api-access-k7mbs\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.953034 4807 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a8f3efef-a103-4813-94ed-1c9bd0113f84-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.953046 4807 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.953056 4807 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.953066 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8f3efef-a103-4813-94ed-1c9bd0113f84-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.953077 4807 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a8f3efef-a103-4813-94ed-1c9bd0113f84-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:31 crc kubenswrapper[4807]: I1202 20:04:31.953087 4807 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a8f3efef-a103-4813-94ed-1c9bd0113f84-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:32 crc kubenswrapper[4807]: I1202 20:04:32.372129 4807 generic.go:334] "Generic (PLEG): container finished" podID="a8f3efef-a103-4813-94ed-1c9bd0113f84" containerID="bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c" exitCode=0 Dec 02 20:04:32 crc kubenswrapper[4807]: I1202 20:04:32.372188 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" event={"ID":"a8f3efef-a103-4813-94ed-1c9bd0113f84","Type":"ContainerDied","Data":"bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c"} Dec 02 20:04:32 crc kubenswrapper[4807]: I1202 20:04:32.372229 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" event={"ID":"a8f3efef-a103-4813-94ed-1c9bd0113f84","Type":"ContainerDied","Data":"e18d639997782f9f1aa5f01c8d86e3c2bbd76edc065476bf43ac3dbeb378502f"} Dec 02 20:04:32 crc kubenswrapper[4807]: I1202 20:04:32.372234 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jh4tx" Dec 02 20:04:32 crc kubenswrapper[4807]: I1202 20:04:32.372251 4807 scope.go:117] "RemoveContainer" containerID="bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c" Dec 02 20:04:32 crc kubenswrapper[4807]: I1202 20:04:32.394760 4807 scope.go:117] "RemoveContainer" containerID="bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c" Dec 02 20:04:32 crc kubenswrapper[4807]: E1202 20:04:32.395349 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c\": container with ID starting with bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c not found: ID does not exist" containerID="bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c" Dec 02 20:04:32 crc kubenswrapper[4807]: I1202 20:04:32.395407 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c"} err="failed to get container status \"bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c\": rpc error: code = NotFound desc = could not find container \"bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c\": container with ID starting with bc30733a1c9196bc6cb498f46ac3079719bdecc3dee95d5b91a27d31084fe34c not found: ID does not exist" Dec 02 20:04:32 crc kubenswrapper[4807]: I1202 20:04:32.409185 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jh4tx"] Dec 02 20:04:32 crc kubenswrapper[4807]: I1202 20:04:32.412699 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jh4tx"] Dec 02 20:04:32 crc kubenswrapper[4807]: I1202 20:04:32.981110 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a8f3efef-a103-4813-94ed-1c9bd0113f84" path="/var/lib/kubelet/pods/a8f3efef-a103-4813-94ed-1c9bd0113f84/volumes" Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.740144 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dc74n"] Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.741259 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dc74n" podUID="095ad74d-f5f1-44f3-9007-c779f4f06f62" containerName="registry-server" containerID="cri-o://47190befdc266a72d78ba4833808b6c46634a39515f1e84eba94015c02d8b96f" gracePeriod=30 Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.747385 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lmdbd"] Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.747675 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lmdbd" podUID="21184a59-8520-4dd0-b459-a056b42e852d" containerName="registry-server" containerID="cri-o://a92f4249ed10df51d6c96cd65f20b67547c3571bc4a738ba70cf0a4f0f0434f2" gracePeriod=30 Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.759046 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2c9sc"] Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.759441 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" podUID="b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c" containerName="marketplace-operator" containerID="cri-o://b3f53f284cf96205f4427e5b70fc2bc074e2ba7c47dfd741d752f28115a2d3bb" gracePeriod=30 Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.762876 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x9z86"] Dec 02 20:04:45 crc kubenswrapper[4807]: 
I1202 20:04:45.763225 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x9z86" podUID="efbbb9d8-ccd1-40c9-a146-20dffe720203" containerName="registry-server" containerID="cri-o://65b8ff673a10a84f3060642c0d188b55ed72bc70ba45aa95b5e846cf784adf7e" gracePeriod=30 Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.770242 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9mxjq"] Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.771826 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9mxjq" podUID="40b1a65e-4886-466b-81ca-387c4b36310a" containerName="registry-server" containerID="cri-o://2669c4507c7283518d0553b314b82ae4b4683466028c8d6e73a2565afa58429f" gracePeriod=30 Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.784262 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xd6tg"] Dec 02 20:04:45 crc kubenswrapper[4807]: E1202 20:04:45.784495 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8f3efef-a103-4813-94ed-1c9bd0113f84" containerName="registry" Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.784508 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f3efef-a103-4813-94ed-1c9bd0113f84" containerName="registry" Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.784632 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8f3efef-a103-4813-94ed-1c9bd0113f84" containerName="registry" Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.785067 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.807305 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xd6tg"] Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.943592 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ffb3e245-8658-45b6-b784-250cd6d34a93-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xd6tg\" (UID: \"ffb3e245-8658-45b6-b784-250cd6d34a93\") " pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.943712 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtwj5\" (UniqueName: \"kubernetes.io/projected/ffb3e245-8658-45b6-b784-250cd6d34a93-kube-api-access-jtwj5\") pod \"marketplace-operator-79b997595-xd6tg\" (UID: \"ffb3e245-8658-45b6-b784-250cd6d34a93\") " pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:45 crc kubenswrapper[4807]: I1202 20:04:45.943899 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ffb3e245-8658-45b6-b784-250cd6d34a93-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xd6tg\" (UID: \"ffb3e245-8658-45b6-b784-250cd6d34a93\") " pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.044889 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ffb3e245-8658-45b6-b784-250cd6d34a93-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xd6tg\" (UID: 
\"ffb3e245-8658-45b6-b784-250cd6d34a93\") " pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.044996 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ffb3e245-8658-45b6-b784-250cd6d34a93-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xd6tg\" (UID: \"ffb3e245-8658-45b6-b784-250cd6d34a93\") " pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.045025 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtwj5\" (UniqueName: \"kubernetes.io/projected/ffb3e245-8658-45b6-b784-250cd6d34a93-kube-api-access-jtwj5\") pod \"marketplace-operator-79b997595-xd6tg\" (UID: \"ffb3e245-8658-45b6-b784-250cd6d34a93\") " pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.047763 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ffb3e245-8658-45b6-b784-250cd6d34a93-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xd6tg\" (UID: \"ffb3e245-8658-45b6-b784-250cd6d34a93\") " pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.050937 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ffb3e245-8658-45b6-b784-250cd6d34a93-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xd6tg\" (UID: \"ffb3e245-8658-45b6-b784-250cd6d34a93\") " pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.072356 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtwj5\" 
(UniqueName: \"kubernetes.io/projected/ffb3e245-8658-45b6-b784-250cd6d34a93-kube-api-access-jtwj5\") pod \"marketplace-operator-79b997595-xd6tg\" (UID: \"ffb3e245-8658-45b6-b784-250cd6d34a93\") " pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.106205 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.362559 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xd6tg"] Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.470267 4807 generic.go:334] "Generic (PLEG): container finished" podID="40b1a65e-4886-466b-81ca-387c4b36310a" containerID="2669c4507c7283518d0553b314b82ae4b4683466028c8d6e73a2565afa58429f" exitCode=0 Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.470396 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mxjq" event={"ID":"40b1a65e-4886-466b-81ca-387c4b36310a","Type":"ContainerDied","Data":"2669c4507c7283518d0553b314b82ae4b4683466028c8d6e73a2565afa58429f"} Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.472451 4807 generic.go:334] "Generic (PLEG): container finished" podID="b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c" containerID="b3f53f284cf96205f4427e5b70fc2bc074e2ba7c47dfd741d752f28115a2d3bb" exitCode=0 Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.472552 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" event={"ID":"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c","Type":"ContainerDied","Data":"b3f53f284cf96205f4427e5b70fc2bc074e2ba7c47dfd741d752f28115a2d3bb"} Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.475200 4807 generic.go:334] "Generic (PLEG): container finished" podID="efbbb9d8-ccd1-40c9-a146-20dffe720203" 
containerID="65b8ff673a10a84f3060642c0d188b55ed72bc70ba45aa95b5e846cf784adf7e" exitCode=0 Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.475270 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x9z86" event={"ID":"efbbb9d8-ccd1-40c9-a146-20dffe720203","Type":"ContainerDied","Data":"65b8ff673a10a84f3060642c0d188b55ed72bc70ba45aa95b5e846cf784adf7e"} Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.478257 4807 generic.go:334] "Generic (PLEG): container finished" podID="21184a59-8520-4dd0-b459-a056b42e852d" containerID="a92f4249ed10df51d6c96cd65f20b67547c3571bc4a738ba70cf0a4f0f0434f2" exitCode=0 Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.478334 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmdbd" event={"ID":"21184a59-8520-4dd0-b459-a056b42e852d","Type":"ContainerDied","Data":"a92f4249ed10df51d6c96cd65f20b67547c3571bc4a738ba70cf0a4f0f0434f2"} Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.481175 4807 generic.go:334] "Generic (PLEG): container finished" podID="095ad74d-f5f1-44f3-9007-c779f4f06f62" containerID="47190befdc266a72d78ba4833808b6c46634a39515f1e84eba94015c02d8b96f" exitCode=0 Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.481222 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc74n" event={"ID":"095ad74d-f5f1-44f3-9007-c779f4f06f62","Type":"ContainerDied","Data":"47190befdc266a72d78ba4833808b6c46634a39515f1e84eba94015c02d8b96f"} Dec 02 20:04:46 crc kubenswrapper[4807]: I1202 20:04:46.482699 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" event={"ID":"ffb3e245-8658-45b6-b784-250cd6d34a93","Type":"ContainerStarted","Data":"7cc10a177135036640e70371c4f5b71974f388ec5731b5602e22c88db8437644"} Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.176432 4807 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.314032 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.321621 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.326740 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.336117 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.373176 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhtv5\" (UniqueName: \"kubernetes.io/projected/40b1a65e-4886-466b-81ca-387c4b36310a-kube-api-access-dhtv5\") pod \"40b1a65e-4886-466b-81ca-387c4b36310a\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.373412 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-catalog-content\") pod \"40b1a65e-4886-466b-81ca-387c4b36310a\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.373459 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-utilities\") pod \"40b1a65e-4886-466b-81ca-387c4b36310a\" (UID: \"40b1a65e-4886-466b-81ca-387c4b36310a\") " Dec 02 20:04:47 crc 
kubenswrapper[4807]: I1202 20:04:47.374697 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-utilities" (OuterVolumeSpecName: "utilities") pod "40b1a65e-4886-466b-81ca-387c4b36310a" (UID: "40b1a65e-4886-466b-81ca-387c4b36310a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.385165 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b1a65e-4886-466b-81ca-387c4b36310a-kube-api-access-dhtv5" (OuterVolumeSpecName: "kube-api-access-dhtv5") pod "40b1a65e-4886-466b-81ca-387c4b36310a" (UID: "40b1a65e-4886-466b-81ca-387c4b36310a"). InnerVolumeSpecName "kube-api-access-dhtv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.475171 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkbcd\" (UniqueName: \"kubernetes.io/projected/21184a59-8520-4dd0-b459-a056b42e852d-kube-api-access-jkbcd\") pod \"21184a59-8520-4dd0-b459-a056b42e852d\" (UID: \"21184a59-8520-4dd0-b459-a056b42e852d\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.475248 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-operator-metrics\") pod \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.475285 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-catalog-content\") pod \"21184a59-8520-4dd0-b459-a056b42e852d\" (UID: \"21184a59-8520-4dd0-b459-a056b42e852d\") " Dec 02 20:04:47 crc 
kubenswrapper[4807]: I1202 20:04:47.475341 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-catalog-content\") pod \"efbbb9d8-ccd1-40c9-a146-20dffe720203\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.475376 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8thbr\" (UniqueName: \"kubernetes.io/projected/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-kube-api-access-8thbr\") pod \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.475410 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-utilities\") pod \"efbbb9d8-ccd1-40c9-a146-20dffe720203\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.475443 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4kpp\" (UniqueName: \"kubernetes.io/projected/efbbb9d8-ccd1-40c9-a146-20dffe720203-kube-api-access-j4kpp\") pod \"efbbb9d8-ccd1-40c9-a146-20dffe720203\" (UID: \"efbbb9d8-ccd1-40c9-a146-20dffe720203\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.475478 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-utilities\") pod \"21184a59-8520-4dd0-b459-a056b42e852d\" (UID: \"21184a59-8520-4dd0-b459-a056b42e852d\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.475508 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-trusted-ca\") pod \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\" (UID: \"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.475534 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-catalog-content\") pod \"095ad74d-f5f1-44f3-9007-c779f4f06f62\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.475568 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjk9k\" (UniqueName: \"kubernetes.io/projected/095ad74d-f5f1-44f3-9007-c779f4f06f62-kube-api-access-wjk9k\") pod \"095ad74d-f5f1-44f3-9007-c779f4f06f62\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.475647 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-utilities\") pod \"095ad74d-f5f1-44f3-9007-c779f4f06f62\" (UID: \"095ad74d-f5f1-44f3-9007-c779f4f06f62\") " Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.476006 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.476025 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhtv5\" (UniqueName: \"kubernetes.io/projected/40b1a65e-4886-466b-81ca-387c4b36310a-kube-api-access-dhtv5\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.477865 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-utilities" (OuterVolumeSpecName: "utilities") pod "21184a59-8520-4dd0-b459-a056b42e852d" (UID: "21184a59-8520-4dd0-b459-a056b42e852d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.477899 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-utilities" (OuterVolumeSpecName: "utilities") pod "095ad74d-f5f1-44f3-9007-c779f4f06f62" (UID: "095ad74d-f5f1-44f3-9007-c779f4f06f62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.477983 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c" (UID: "b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.478245 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-utilities" (OuterVolumeSpecName: "utilities") pod "efbbb9d8-ccd1-40c9-a146-20dffe720203" (UID: "efbbb9d8-ccd1-40c9-a146-20dffe720203"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.482520 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c" (UID: "b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c"). 
InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.483147 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efbbb9d8-ccd1-40c9-a146-20dffe720203-kube-api-access-j4kpp" (OuterVolumeSpecName: "kube-api-access-j4kpp") pod "efbbb9d8-ccd1-40c9-a146-20dffe720203" (UID: "efbbb9d8-ccd1-40c9-a146-20dffe720203"). InnerVolumeSpecName "kube-api-access-j4kpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.486056 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21184a59-8520-4dd0-b459-a056b42e852d-kube-api-access-jkbcd" (OuterVolumeSpecName: "kube-api-access-jkbcd") pod "21184a59-8520-4dd0-b459-a056b42e852d" (UID: "21184a59-8520-4dd0-b459-a056b42e852d"). InnerVolumeSpecName "kube-api-access-jkbcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.487304 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-kube-api-access-8thbr" (OuterVolumeSpecName: "kube-api-access-8thbr") pod "b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c" (UID: "b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c"). InnerVolumeSpecName "kube-api-access-8thbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.487079 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095ad74d-f5f1-44f3-9007-c779f4f06f62-kube-api-access-wjk9k" (OuterVolumeSpecName: "kube-api-access-wjk9k") pod "095ad74d-f5f1-44f3-9007-c779f4f06f62" (UID: "095ad74d-f5f1-44f3-9007-c779f4f06f62"). InnerVolumeSpecName "kube-api-access-wjk9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.492069 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" event={"ID":"ffb3e245-8658-45b6-b784-250cd6d34a93","Type":"ContainerStarted","Data":"2b6def3c28feaefeef10654ae414fe97bb0bb4e97572fae96024ee21b909e52b"} Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.493522 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.499726 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mxjq" event={"ID":"40b1a65e-4886-466b-81ca-387c4b36310a","Type":"ContainerDied","Data":"147ef5c89da0f34da202ad96035e01ed08d4dd129fbd984cc174d24612deddb8"} Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.499820 4807 scope.go:117] "RemoveContainer" containerID="2669c4507c7283518d0553b314b82ae4b4683466028c8d6e73a2565afa58429f" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.499965 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9mxjq" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.501227 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efbbb9d8-ccd1-40c9-a146-20dffe720203" (UID: "efbbb9d8-ccd1-40c9-a146-20dffe720203"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.504699 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40b1a65e-4886-466b-81ca-387c4b36310a" (UID: "40b1a65e-4886-466b-81ca-387c4b36310a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.505319 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.505841 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2c9sc" event={"ID":"b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c","Type":"ContainerDied","Data":"7fee9f323894aabc80d70b6efd6262707b81c986b82e99c8cd79d022dc229e88"} Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.508863 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.521930 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xd6tg" podStartSLOduration=2.521896675 podStartE2EDuration="2.521896675s" podCreationTimestamp="2025-12-02 20:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:04:47.518444679 +0000 UTC m=+422.819352174" watchObservedRunningTime="2025-12-02 20:04:47.521896675 +0000 UTC m=+422.822804160" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.525051 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x9z86" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.525377 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x9z86" event={"ID":"efbbb9d8-ccd1-40c9-a146-20dffe720203","Type":"ContainerDied","Data":"119dc2828f7a6d180ea30edb3377d13e95b9421260e9739f816fa27bfac0a24d"} Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.530063 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lmdbd" event={"ID":"21184a59-8520-4dd0-b459-a056b42e852d","Type":"ContainerDied","Data":"16567e25bb71449b6db8e18973b71a7094f024e0f93f8d53132b0377518a795b"} Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.530151 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lmdbd" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.540212 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dc74n" event={"ID":"095ad74d-f5f1-44f3-9007-c779f4f06f62","Type":"ContainerDied","Data":"0ce1fd7e3b3af0c81c9c535717cd9de387d979b6f24c942370e998b792da5e68"} Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.540454 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dc74n" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.554000 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21184a59-8520-4dd0-b459-a056b42e852d" (UID: "21184a59-8520-4dd0-b459-a056b42e852d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.562962 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "095ad74d-f5f1-44f3-9007-c779f4f06f62" (UID: "095ad74d-f5f1-44f3-9007-c779f4f06f62"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.567211 4807 scope.go:117] "RemoveContainer" containerID="b2f9b345ea1b984bcaf053e4f9d206fc1a16460613e5d8b24745a5a58c2af6a9" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578261 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4kpp\" (UniqueName: \"kubernetes.io/projected/efbbb9d8-ccd1-40c9-a146-20dffe720203-kube-api-access-j4kpp\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578299 4807 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578317 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578330 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578341 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjk9k\" (UniqueName: 
\"kubernetes.io/projected/095ad74d-f5f1-44f3-9007-c779f4f06f62-kube-api-access-wjk9k\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578350 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40b1a65e-4886-466b-81ca-387c4b36310a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578363 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095ad74d-f5f1-44f3-9007-c779f4f06f62-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578372 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkbcd\" (UniqueName: \"kubernetes.io/projected/21184a59-8520-4dd0-b459-a056b42e852d-kube-api-access-jkbcd\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578381 4807 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578390 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21184a59-8520-4dd0-b459-a056b42e852d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578401 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578411 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8thbr\" (UniqueName: 
\"kubernetes.io/projected/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c-kube-api-access-8thbr\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.578419 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efbbb9d8-ccd1-40c9-a146-20dffe720203-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.592351 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2c9sc"] Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.594445 4807 scope.go:117] "RemoveContainer" containerID="d56c820b0a0f8645d1976a3f89429956da3bcb636b45cb2ea26acd3a691bf1c3" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.596136 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2c9sc"] Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.621697 4807 scope.go:117] "RemoveContainer" containerID="b3f53f284cf96205f4427e5b70fc2bc074e2ba7c47dfd741d752f28115a2d3bb" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.622816 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x9z86"] Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.627857 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x9z86"] Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.640288 4807 scope.go:117] "RemoveContainer" containerID="65b8ff673a10a84f3060642c0d188b55ed72bc70ba45aa95b5e846cf784adf7e" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.653340 4807 scope.go:117] "RemoveContainer" containerID="fd1c6b4f67d95230f03d9d1ffc859379359a05cc5213e20d1bba599e9a6a20df" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.667530 4807 scope.go:117] "RemoveContainer" containerID="facc3a7f6d40e75813592078a8cced23d3589b74324a3cd6b0fde668b63503eb" Dec 02 
20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.684167 4807 scope.go:117] "RemoveContainer" containerID="a92f4249ed10df51d6c96cd65f20b67547c3571bc4a738ba70cf0a4f0f0434f2" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.701761 4807 scope.go:117] "RemoveContainer" containerID="fea6d7a742e85bb055296aae03bed7991cd4c0562259a20f906f4717857a339d" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.717659 4807 scope.go:117] "RemoveContainer" containerID="bde1a117bdec2e795c46c481b632482a9e427cbd22e34d186220a4ed36c094cf" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.733931 4807 scope.go:117] "RemoveContainer" containerID="47190befdc266a72d78ba4833808b6c46634a39515f1e84eba94015c02d8b96f" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.750174 4807 scope.go:117] "RemoveContainer" containerID="1d2531aee2690b81e86eba992d1c64080cc10e58392d8f7821438fe36251a555" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.765211 4807 scope.go:117] "RemoveContainer" containerID="f8103470c6692017c6e127fc486d5130985f6e37c306bf11625fd3d4a3626348" Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.831046 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9mxjq"] Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.835259 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9mxjq"] Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.866522 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lmdbd"] Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.869848 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lmdbd"] Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.901457 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dc74n"] Dec 02 20:04:47 crc kubenswrapper[4807]: I1202 20:04:47.905443 4807 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dc74n"] Dec 02 20:04:48 crc kubenswrapper[4807]: I1202 20:04:48.980128 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095ad74d-f5f1-44f3-9007-c779f4f06f62" path="/var/lib/kubelet/pods/095ad74d-f5f1-44f3-9007-c779f4f06f62/volumes" Dec 02 20:04:48 crc kubenswrapper[4807]: I1202 20:04:48.981801 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21184a59-8520-4dd0-b459-a056b42e852d" path="/var/lib/kubelet/pods/21184a59-8520-4dd0-b459-a056b42e852d/volumes" Dec 02 20:04:48 crc kubenswrapper[4807]: I1202 20:04:48.982593 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b1a65e-4886-466b-81ca-387c4b36310a" path="/var/lib/kubelet/pods/40b1a65e-4886-466b-81ca-387c4b36310a/volumes" Dec 02 20:04:48 crc kubenswrapper[4807]: I1202 20:04:48.983856 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c" path="/var/lib/kubelet/pods/b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c/volumes" Dec 02 20:04:48 crc kubenswrapper[4807]: I1202 20:04:48.984419 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efbbb9d8-ccd1-40c9-a146-20dffe720203" path="/var/lib/kubelet/pods/efbbb9d8-ccd1-40c9-a146-20dffe720203/volumes" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948064 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-khjdb"] Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948317 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b1a65e-4886-466b-81ca-387c4b36310a" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948330 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b1a65e-4886-466b-81ca-387c4b36310a" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948346 4807 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c" containerName="marketplace-operator" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948352 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c" containerName="marketplace-operator" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948364 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095ad74d-f5f1-44f3-9007-c779f4f06f62" containerName="extract-content" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948370 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="095ad74d-f5f1-44f3-9007-c779f4f06f62" containerName="extract-content" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948379 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efbbb9d8-ccd1-40c9-a146-20dffe720203" containerName="extract-content" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948385 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="efbbb9d8-ccd1-40c9-a146-20dffe720203" containerName="extract-content" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948394 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095ad74d-f5f1-44f3-9007-c779f4f06f62" containerName="extract-utilities" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948403 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="095ad74d-f5f1-44f3-9007-c779f4f06f62" containerName="extract-utilities" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948412 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095ad74d-f5f1-44f3-9007-c779f4f06f62" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948420 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="095ad74d-f5f1-44f3-9007-c779f4f06f62" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948432 4807 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efbbb9d8-ccd1-40c9-a146-20dffe720203" containerName="extract-utilities" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948441 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="efbbb9d8-ccd1-40c9-a146-20dffe720203" containerName="extract-utilities" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948452 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b1a65e-4886-466b-81ca-387c4b36310a" containerName="extract-utilities" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948459 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b1a65e-4886-466b-81ca-387c4b36310a" containerName="extract-utilities" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948503 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efbbb9d8-ccd1-40c9-a146-20dffe720203" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948511 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="efbbb9d8-ccd1-40c9-a146-20dffe720203" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948527 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21184a59-8520-4dd0-b459-a056b42e852d" containerName="extract-content" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948536 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="21184a59-8520-4dd0-b459-a056b42e852d" containerName="extract-content" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948551 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b1a65e-4886-466b-81ca-387c4b36310a" containerName="extract-content" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948558 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b1a65e-4886-466b-81ca-387c4b36310a" containerName="extract-content" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948569 4807 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21184a59-8520-4dd0-b459-a056b42e852d" containerName="extract-utilities" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948578 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="21184a59-8520-4dd0-b459-a056b42e852d" containerName="extract-utilities" Dec 02 20:04:49 crc kubenswrapper[4807]: E1202 20:04:49.948587 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21184a59-8520-4dd0-b459-a056b42e852d" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948595 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="21184a59-8520-4dd0-b459-a056b42e852d" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948749 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="095ad74d-f5f1-44f3-9007-c779f4f06f62" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948764 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b1a65e-4886-466b-81ca-387c4b36310a" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948772 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b610ced8-c5b0-49d3-a3cd-1aa082f0dd6c" containerName="marketplace-operator" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948784 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="efbbb9d8-ccd1-40c9-a146-20dffe720203" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.948798 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="21184a59-8520-4dd0-b459-a056b42e852d" containerName="registry-server" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.949708 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.954037 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 20:04:49 crc kubenswrapper[4807]: I1202 20:04:49.967491 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khjdb"] Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.014980 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjrdv\" (UniqueName: \"kubernetes.io/projected/b1e5d8f8-0730-44b0-beb7-652e4b9461bd-kube-api-access-mjrdv\") pod \"community-operators-khjdb\" (UID: \"b1e5d8f8-0730-44b0-beb7-652e4b9461bd\") " pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.015041 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1e5d8f8-0730-44b0-beb7-652e4b9461bd-utilities\") pod \"community-operators-khjdb\" (UID: \"b1e5d8f8-0730-44b0-beb7-652e4b9461bd\") " pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.015072 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1e5d8f8-0730-44b0-beb7-652e4b9461bd-catalog-content\") pod \"community-operators-khjdb\" (UID: \"b1e5d8f8-0730-44b0-beb7-652e4b9461bd\") " pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.115845 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjrdv\" (UniqueName: \"kubernetes.io/projected/b1e5d8f8-0730-44b0-beb7-652e4b9461bd-kube-api-access-mjrdv\") pod \"community-operators-khjdb\" 
(UID: \"b1e5d8f8-0730-44b0-beb7-652e4b9461bd\") " pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.115915 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1e5d8f8-0730-44b0-beb7-652e4b9461bd-utilities\") pod \"community-operators-khjdb\" (UID: \"b1e5d8f8-0730-44b0-beb7-652e4b9461bd\") " pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.115947 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1e5d8f8-0730-44b0-beb7-652e4b9461bd-catalog-content\") pod \"community-operators-khjdb\" (UID: \"b1e5d8f8-0730-44b0-beb7-652e4b9461bd\") " pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.116877 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1e5d8f8-0730-44b0-beb7-652e4b9461bd-utilities\") pod \"community-operators-khjdb\" (UID: \"b1e5d8f8-0730-44b0-beb7-652e4b9461bd\") " pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.116914 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1e5d8f8-0730-44b0-beb7-652e4b9461bd-catalog-content\") pod \"community-operators-khjdb\" (UID: \"b1e5d8f8-0730-44b0-beb7-652e4b9461bd\") " pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.141491 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjrdv\" (UniqueName: \"kubernetes.io/projected/b1e5d8f8-0730-44b0-beb7-652e4b9461bd-kube-api-access-mjrdv\") pod \"community-operators-khjdb\" (UID: \"b1e5d8f8-0730-44b0-beb7-652e4b9461bd\") " 
pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.167706 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qfvxm"] Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.173399 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.179935 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.182974 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qfvxm"] Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.281187 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.319153 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z57fn\" (UniqueName: \"kubernetes.io/projected/c593c7c1-9bcc-4c52-92de-818b1cae7d51-kube-api-access-z57fn\") pod \"certified-operators-qfvxm\" (UID: \"c593c7c1-9bcc-4c52-92de-818b1cae7d51\") " pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.319637 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c593c7c1-9bcc-4c52-92de-818b1cae7d51-catalog-content\") pod \"certified-operators-qfvxm\" (UID: \"c593c7c1-9bcc-4c52-92de-818b1cae7d51\") " pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.319701 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/c593c7c1-9bcc-4c52-92de-818b1cae7d51-utilities\") pod \"certified-operators-qfvxm\" (UID: \"c593c7c1-9bcc-4c52-92de-818b1cae7d51\") " pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.420758 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z57fn\" (UniqueName: \"kubernetes.io/projected/c593c7c1-9bcc-4c52-92de-818b1cae7d51-kube-api-access-z57fn\") pod \"certified-operators-qfvxm\" (UID: \"c593c7c1-9bcc-4c52-92de-818b1cae7d51\") " pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.420860 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c593c7c1-9bcc-4c52-92de-818b1cae7d51-catalog-content\") pod \"certified-operators-qfvxm\" (UID: \"c593c7c1-9bcc-4c52-92de-818b1cae7d51\") " pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.420900 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c593c7c1-9bcc-4c52-92de-818b1cae7d51-utilities\") pod \"certified-operators-qfvxm\" (UID: \"c593c7c1-9bcc-4c52-92de-818b1cae7d51\") " pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.422271 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c593c7c1-9bcc-4c52-92de-818b1cae7d51-utilities\") pod \"certified-operators-qfvxm\" (UID: \"c593c7c1-9bcc-4c52-92de-818b1cae7d51\") " pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.423775 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c593c7c1-9bcc-4c52-92de-818b1cae7d51-catalog-content\") pod \"certified-operators-qfvxm\" (UID: \"c593c7c1-9bcc-4c52-92de-818b1cae7d51\") " pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.448481 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z57fn\" (UniqueName: \"kubernetes.io/projected/c593c7c1-9bcc-4c52-92de-818b1cae7d51-kube-api-access-z57fn\") pod \"certified-operators-qfvxm\" (UID: \"c593c7c1-9bcc-4c52-92de-818b1cae7d51\") " pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.494238 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khjdb"] Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.497919 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:04:50 crc kubenswrapper[4807]: W1202 20:04:50.506776 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1e5d8f8_0730_44b0_beb7_652e4b9461bd.slice/crio-dc74e03fbd02683a97e53148822438f190b728df53214a87500dcfb8f79f1516 WatchSource:0}: Error finding container dc74e03fbd02683a97e53148822438f190b728df53214a87500dcfb8f79f1516: Status 404 returned error can't find the container with id dc74e03fbd02683a97e53148822438f190b728df53214a87500dcfb8f79f1516 Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.569053 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khjdb" event={"ID":"b1e5d8f8-0730-44b0-beb7-652e4b9461bd","Type":"ContainerStarted","Data":"dc74e03fbd02683a97e53148822438f190b728df53214a87500dcfb8f79f1516"} Dec 02 20:04:50 crc kubenswrapper[4807]: I1202 20:04:50.922499 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-qfvxm"] Dec 02 20:04:51 crc kubenswrapper[4807]: I1202 20:04:51.577515 4807 generic.go:334] "Generic (PLEG): container finished" podID="b1e5d8f8-0730-44b0-beb7-652e4b9461bd" containerID="a096a913b94618d3259e795731108f7c3c30817280f8744f59a642e2e21b9c62" exitCode=0 Dec 02 20:04:51 crc kubenswrapper[4807]: I1202 20:04:51.577600 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khjdb" event={"ID":"b1e5d8f8-0730-44b0-beb7-652e4b9461bd","Type":"ContainerDied","Data":"a096a913b94618d3259e795731108f7c3c30817280f8744f59a642e2e21b9c62"} Dec 02 20:04:51 crc kubenswrapper[4807]: I1202 20:04:51.581669 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfvxm" event={"ID":"c593c7c1-9bcc-4c52-92de-818b1cae7d51","Type":"ContainerDied","Data":"4bfddb7e035e4cb24be0ba5dfb5088bb9719f852967b544169e489aa1ef2af7e"} Dec 02 20:04:51 crc kubenswrapper[4807]: I1202 20:04:51.581507 4807 generic.go:334] "Generic (PLEG): container finished" podID="c593c7c1-9bcc-4c52-92de-818b1cae7d51" containerID="4bfddb7e035e4cb24be0ba5dfb5088bb9719f852967b544169e489aa1ef2af7e" exitCode=0 Dec 02 20:04:51 crc kubenswrapper[4807]: I1202 20:04:51.582565 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfvxm" event={"ID":"c593c7c1-9bcc-4c52-92de-818b1cae7d51","Type":"ContainerStarted","Data":"97b9dee46cee259fdc323bc259fb4c31e582bb30c50b88f2e2244df89e961872"} Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.348564 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jl5sd"] Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.350484 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.354268 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.360007 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jl5sd"] Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.447597 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/658f9914-a1d3-4c38-a8fd-ef69123b8f0a-catalog-content\") pod \"redhat-marketplace-jl5sd\" (UID: \"658f9914-a1d3-4c38-a8fd-ef69123b8f0a\") " pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.447655 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmmnx\" (UniqueName: \"kubernetes.io/projected/658f9914-a1d3-4c38-a8fd-ef69123b8f0a-kube-api-access-rmmnx\") pod \"redhat-marketplace-jl5sd\" (UID: \"658f9914-a1d3-4c38-a8fd-ef69123b8f0a\") " pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.447730 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/658f9914-a1d3-4c38-a8fd-ef69123b8f0a-utilities\") pod \"redhat-marketplace-jl5sd\" (UID: \"658f9914-a1d3-4c38-a8fd-ef69123b8f0a\") " pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.549068 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/658f9914-a1d3-4c38-a8fd-ef69123b8f0a-catalog-content\") pod \"redhat-marketplace-jl5sd\" (UID: 
\"658f9914-a1d3-4c38-a8fd-ef69123b8f0a\") " pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.549116 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmmnx\" (UniqueName: \"kubernetes.io/projected/658f9914-a1d3-4c38-a8fd-ef69123b8f0a-kube-api-access-rmmnx\") pod \"redhat-marketplace-jl5sd\" (UID: \"658f9914-a1d3-4c38-a8fd-ef69123b8f0a\") " pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.549162 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/658f9914-a1d3-4c38-a8fd-ef69123b8f0a-utilities\") pod \"redhat-marketplace-jl5sd\" (UID: \"658f9914-a1d3-4c38-a8fd-ef69123b8f0a\") " pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.549945 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/658f9914-a1d3-4c38-a8fd-ef69123b8f0a-utilities\") pod \"redhat-marketplace-jl5sd\" (UID: \"658f9914-a1d3-4c38-a8fd-ef69123b8f0a\") " pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.550234 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/658f9914-a1d3-4c38-a8fd-ef69123b8f0a-catalog-content\") pod \"redhat-marketplace-jl5sd\" (UID: \"658f9914-a1d3-4c38-a8fd-ef69123b8f0a\") " pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.551863 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kpfcc"] Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.553072 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.559888 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpfcc"] Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.562097 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.579264 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmmnx\" (UniqueName: \"kubernetes.io/projected/658f9914-a1d3-4c38-a8fd-ef69123b8f0a-kube-api-access-rmmnx\") pod \"redhat-marketplace-jl5sd\" (UID: \"658f9914-a1d3-4c38-a8fd-ef69123b8f0a\") " pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.589909 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfvxm" event={"ID":"c593c7c1-9bcc-4c52-92de-818b1cae7d51","Type":"ContainerStarted","Data":"d6c99c1279a1c3801f2ff259e155fe0b74ccae59cce33075ac309f87bbe38036"} Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.650916 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-utilities\") pod \"redhat-operators-kpfcc\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.650989 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlhg4\" (UniqueName: \"kubernetes.io/projected/7a430d81-93d6-44ac-b492-762898abc32c-kube-api-access-vlhg4\") pod \"redhat-operators-kpfcc\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:04:52 crc 
kubenswrapper[4807]: I1202 20:04:52.651036 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-catalog-content\") pod \"redhat-operators-kpfcc\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.690932 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.751811 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-utilities\") pod \"redhat-operators-kpfcc\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.752274 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlhg4\" (UniqueName: \"kubernetes.io/projected/7a430d81-93d6-44ac-b492-762898abc32c-kube-api-access-vlhg4\") pod \"redhat-operators-kpfcc\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.752311 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-utilities\") pod \"redhat-operators-kpfcc\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.752597 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-catalog-content\") pod 
\"redhat-operators-kpfcc\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.752992 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-catalog-content\") pod \"redhat-operators-kpfcc\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.779814 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlhg4\" (UniqueName: \"kubernetes.io/projected/7a430d81-93d6-44ac-b492-762898abc32c-kube-api-access-vlhg4\") pod \"redhat-operators-kpfcc\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:04:52 crc kubenswrapper[4807]: I1202 20:04:52.874742 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 20:04:53.059395 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kpfcc"] Dec 02 20:04:53 crc kubenswrapper[4807]: W1202 20:04:53.067851 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a430d81_93d6_44ac_b492_762898abc32c.slice/crio-4f30abe756848ac46d4b5308ad7e7bd38f4cd1f9ff891efd0fd2710aa6e1349a WatchSource:0}: Error finding container 4f30abe756848ac46d4b5308ad7e7bd38f4cd1f9ff891efd0fd2710aa6e1349a: Status 404 returned error can't find the container with id 4f30abe756848ac46d4b5308ad7e7bd38f4cd1f9ff891efd0fd2710aa6e1349a Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 20:04:53.090939 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jl5sd"] Dec 02 20:04:53 crc kubenswrapper[4807]: W1202 20:04:53.099424 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod658f9914_a1d3_4c38_a8fd_ef69123b8f0a.slice/crio-faf84b1395314fbbd132e6117c58cb8283f0315708e9425256e45e971f6536c8 WatchSource:0}: Error finding container faf84b1395314fbbd132e6117c58cb8283f0315708e9425256e45e971f6536c8: Status 404 returned error can't find the container with id faf84b1395314fbbd132e6117c58cb8283f0315708e9425256e45e971f6536c8 Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 20:04:53.596229 4807 generic.go:334] "Generic (PLEG): container finished" podID="658f9914-a1d3-4c38-a8fd-ef69123b8f0a" containerID="3bc8cca112f12bfde1147cba55ba5e35cd29316d742f4c05e8ac879af2c60b90" exitCode=0 Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 20:04:53.596368 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl5sd" 
event={"ID":"658f9914-a1d3-4c38-a8fd-ef69123b8f0a","Type":"ContainerDied","Data":"3bc8cca112f12bfde1147cba55ba5e35cd29316d742f4c05e8ac879af2c60b90"} Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 20:04:53.596698 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl5sd" event={"ID":"658f9914-a1d3-4c38-a8fd-ef69123b8f0a","Type":"ContainerStarted","Data":"faf84b1395314fbbd132e6117c58cb8283f0315708e9425256e45e971f6536c8"} Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 20:04:53.600408 4807 generic.go:334] "Generic (PLEG): container finished" podID="c593c7c1-9bcc-4c52-92de-818b1cae7d51" containerID="d6c99c1279a1c3801f2ff259e155fe0b74ccae59cce33075ac309f87bbe38036" exitCode=0 Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 20:04:53.600513 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfvxm" event={"ID":"c593c7c1-9bcc-4c52-92de-818b1cae7d51","Type":"ContainerDied","Data":"d6c99c1279a1c3801f2ff259e155fe0b74ccae59cce33075ac309f87bbe38036"} Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 20:04:53.603249 4807 generic.go:334] "Generic (PLEG): container finished" podID="7a430d81-93d6-44ac-b492-762898abc32c" containerID="83fa0c73d92c61be80bf6eace0cbf8fb07f4c5af838051eed16f909234846b50" exitCode=0 Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 20:04:53.603336 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpfcc" event={"ID":"7a430d81-93d6-44ac-b492-762898abc32c","Type":"ContainerDied","Data":"83fa0c73d92c61be80bf6eace0cbf8fb07f4c5af838051eed16f909234846b50"} Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 20:04:53.603377 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpfcc" event={"ID":"7a430d81-93d6-44ac-b492-762898abc32c","Type":"ContainerStarted","Data":"4f30abe756848ac46d4b5308ad7e7bd38f4cd1f9ff891efd0fd2710aa6e1349a"} Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 
20:04:53.605423 4807 generic.go:334] "Generic (PLEG): container finished" podID="b1e5d8f8-0730-44b0-beb7-652e4b9461bd" containerID="2de32d37f38f8a30d059b927c077263a94b2b1864986f4ec75eab5ec46ff5d4a" exitCode=0 Dec 02 20:04:53 crc kubenswrapper[4807]: I1202 20:04:53.605480 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khjdb" event={"ID":"b1e5d8f8-0730-44b0-beb7-652e4b9461bd","Type":"ContainerDied","Data":"2de32d37f38f8a30d059b927c077263a94b2b1864986f4ec75eab5ec46ff5d4a"} Dec 02 20:04:54 crc kubenswrapper[4807]: I1202 20:04:54.614594 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khjdb" event={"ID":"b1e5d8f8-0730-44b0-beb7-652e4b9461bd","Type":"ContainerStarted","Data":"73cc39014bd470898daba064102dc03ad50cf7a6d63b44d8f692c0a27e1f2124"} Dec 02 20:04:54 crc kubenswrapper[4807]: I1202 20:04:54.617043 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfvxm" event={"ID":"c593c7c1-9bcc-4c52-92de-818b1cae7d51","Type":"ContainerStarted","Data":"1b40ba192b1efb7ab333834d771f3c62dbc9cd80caa50d361deb1cd533390a97"} Dec 02 20:04:54 crc kubenswrapper[4807]: I1202 20:04:54.637357 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qfvxm" podStartSLOduration=2.197603936 podStartE2EDuration="4.637329555s" podCreationTimestamp="2025-12-02 20:04:50 +0000 UTC" firstStartedPulling="2025-12-02 20:04:51.582872192 +0000 UTC m=+426.883779687" lastFinishedPulling="2025-12-02 20:04:54.022597811 +0000 UTC m=+429.323505306" observedRunningTime="2025-12-02 20:04:54.633392356 +0000 UTC m=+429.934299861" watchObservedRunningTime="2025-12-02 20:04:54.637329555 +0000 UTC m=+429.938237050" Dec 02 20:04:55 crc kubenswrapper[4807]: I1202 20:04:55.646451 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-khjdb" 
podStartSLOduration=4.182394635 podStartE2EDuration="6.646427191s" podCreationTimestamp="2025-12-02 20:04:49 +0000 UTC" firstStartedPulling="2025-12-02 20:04:51.580892337 +0000 UTC m=+426.881799832" lastFinishedPulling="2025-12-02 20:04:54.044924893 +0000 UTC m=+429.345832388" observedRunningTime="2025-12-02 20:04:55.645889206 +0000 UTC m=+430.946796721" watchObservedRunningTime="2025-12-02 20:04:55.646427191 +0000 UTC m=+430.947334686" Dec 02 20:04:56 crc kubenswrapper[4807]: I1202 20:04:56.648309 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl5sd" event={"ID":"658f9914-a1d3-4c38-a8fd-ef69123b8f0a","Type":"ContainerStarted","Data":"5dda77f32422b641eec7b9e3576ce5472758f9ca14bc570a9288968e0903e91d"} Dec 02 20:04:56 crc kubenswrapper[4807]: I1202 20:04:56.661065 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpfcc" event={"ID":"7a430d81-93d6-44ac-b492-762898abc32c","Type":"ContainerStarted","Data":"5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0"} Dec 02 20:04:57 crc kubenswrapper[4807]: I1202 20:04:57.668912 4807 generic.go:334] "Generic (PLEG): container finished" podID="658f9914-a1d3-4c38-a8fd-ef69123b8f0a" containerID="5dda77f32422b641eec7b9e3576ce5472758f9ca14bc570a9288968e0903e91d" exitCode=0 Dec 02 20:04:57 crc kubenswrapper[4807]: I1202 20:04:57.669011 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl5sd" event={"ID":"658f9914-a1d3-4c38-a8fd-ef69123b8f0a","Type":"ContainerDied","Data":"5dda77f32422b641eec7b9e3576ce5472758f9ca14bc570a9288968e0903e91d"} Dec 02 20:04:57 crc kubenswrapper[4807]: I1202 20:04:57.672183 4807 generic.go:334] "Generic (PLEG): container finished" podID="7a430d81-93d6-44ac-b492-762898abc32c" containerID="5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0" exitCode=0 Dec 02 20:04:57 crc kubenswrapper[4807]: I1202 20:04:57.672221 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpfcc" event={"ID":"7a430d81-93d6-44ac-b492-762898abc32c","Type":"ContainerDied","Data":"5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0"} Dec 02 20:04:58 crc kubenswrapper[4807]: I1202 20:04:58.293368 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:04:58 crc kubenswrapper[4807]: I1202 20:04:58.294052 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:04:58 crc kubenswrapper[4807]: I1202 20:04:58.294137 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:04:58 crc kubenswrapper[4807]: I1202 20:04:58.295026 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"921e128bf7c751c48abace6e4b39de2d8b371ebc3478e06c957ad3086674020b"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:04:58 crc kubenswrapper[4807]: I1202 20:04:58.295121 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" 
containerID="cri-o://921e128bf7c751c48abace6e4b39de2d8b371ebc3478e06c957ad3086674020b" gracePeriod=600 Dec 02 20:04:59 crc kubenswrapper[4807]: I1202 20:04:59.699873 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpfcc" event={"ID":"7a430d81-93d6-44ac-b492-762898abc32c","Type":"ContainerStarted","Data":"8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2"} Dec 02 20:04:59 crc kubenswrapper[4807]: I1202 20:04:59.703680 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="921e128bf7c751c48abace6e4b39de2d8b371ebc3478e06c957ad3086674020b" exitCode=0 Dec 02 20:04:59 crc kubenswrapper[4807]: I1202 20:04:59.703780 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"921e128bf7c751c48abace6e4b39de2d8b371ebc3478e06c957ad3086674020b"} Dec 02 20:04:59 crc kubenswrapper[4807]: I1202 20:04:59.703873 4807 scope.go:117] "RemoveContainer" containerID="f4f6e314704558e06559f2a18a59b37da486a56043feaa7d9b9b472d48b31079" Dec 02 20:04:59 crc kubenswrapper[4807]: I1202 20:04:59.707263 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jl5sd" event={"ID":"658f9914-a1d3-4c38-a8fd-ef69123b8f0a","Type":"ContainerStarted","Data":"a01339d5c948decb8714de6466cf97e85b81e6b6d6f8a4ac9b71ec6c2c6b27ff"} Dec 02 20:04:59 crc kubenswrapper[4807]: I1202 20:04:59.747190 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kpfcc" podStartSLOduration=2.027041245 podStartE2EDuration="7.747170075s" podCreationTimestamp="2025-12-02 20:04:52 +0000 UTC" firstStartedPulling="2025-12-02 20:04:53.604764126 +0000 UTC m=+428.905671621" lastFinishedPulling="2025-12-02 20:04:59.324892956 +0000 UTC m=+434.625800451" 
observedRunningTime="2025-12-02 20:04:59.725766089 +0000 UTC m=+435.026673594" watchObservedRunningTime="2025-12-02 20:04:59.747170075 +0000 UTC m=+435.048077570" Dec 02 20:05:00 crc kubenswrapper[4807]: I1202 20:05:00.281768 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:05:00 crc kubenswrapper[4807]: I1202 20:05:00.282165 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:05:00 crc kubenswrapper[4807]: I1202 20:05:00.341008 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:05:00 crc kubenswrapper[4807]: I1202 20:05:00.361792 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jl5sd" podStartSLOduration=2.774213944 podStartE2EDuration="8.361771745s" podCreationTimestamp="2025-12-02 20:04:52 +0000 UTC" firstStartedPulling="2025-12-02 20:04:53.598959525 +0000 UTC m=+428.899867020" lastFinishedPulling="2025-12-02 20:04:59.186517326 +0000 UTC m=+434.487424821" observedRunningTime="2025-12-02 20:04:59.750985891 +0000 UTC m=+435.051893406" watchObservedRunningTime="2025-12-02 20:05:00.361771745 +0000 UTC m=+435.662679230" Dec 02 20:05:00 crc kubenswrapper[4807]: I1202 20:05:00.498449 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:05:00 crc kubenswrapper[4807]: I1202 20:05:00.498519 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:05:00 crc kubenswrapper[4807]: I1202 20:05:00.541842 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:05:00 crc kubenswrapper[4807]: I1202 20:05:00.714256 
4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"22c2f7c6316bc285de3478f908e45f30636569ac437e807d0cfac9e66d5f44cd"} Dec 02 20:05:00 crc kubenswrapper[4807]: I1202 20:05:00.757537 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-khjdb" Dec 02 20:05:00 crc kubenswrapper[4807]: I1202 20:05:00.760677 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qfvxm" Dec 02 20:05:02 crc kubenswrapper[4807]: I1202 20:05:02.692129 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:05:02 crc kubenswrapper[4807]: I1202 20:05:02.692525 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:05:02 crc kubenswrapper[4807]: I1202 20:05:02.737610 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:05:02 crc kubenswrapper[4807]: I1202 20:05:02.875010 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:05:02 crc kubenswrapper[4807]: I1202 20:05:02.875073 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:05:03 crc kubenswrapper[4807]: I1202 20:05:03.919907 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kpfcc" podUID="7a430d81-93d6-44ac-b492-762898abc32c" containerName="registry-server" probeResult="failure" output=< Dec 02 20:05:03 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 20:05:03 crc 
kubenswrapper[4807]: > Dec 02 20:05:12 crc kubenswrapper[4807]: I1202 20:05:12.734038 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jl5sd" Dec 02 20:05:12 crc kubenswrapper[4807]: I1202 20:05:12.915277 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:05:12 crc kubenswrapper[4807]: I1202 20:05:12.958489 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:07:28 crc kubenswrapper[4807]: I1202 20:07:28.293013 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:07:28 crc kubenswrapper[4807]: I1202 20:07:28.293689 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:07:58 crc kubenswrapper[4807]: I1202 20:07:58.292880 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:07:58 crc kubenswrapper[4807]: I1202 20:07:58.293551 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:08:28 crc kubenswrapper[4807]: I1202 20:08:28.292688 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:08:28 crc kubenswrapper[4807]: I1202 20:08:28.293360 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:08:28 crc kubenswrapper[4807]: I1202 20:08:28.293431 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:08:28 crc kubenswrapper[4807]: I1202 20:08:28.294287 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22c2f7c6316bc285de3478f908e45f30636569ac437e807d0cfac9e66d5f44cd"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:08:28 crc kubenswrapper[4807]: I1202 20:08:28.294455 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://22c2f7c6316bc285de3478f908e45f30636569ac437e807d0cfac9e66d5f44cd" gracePeriod=600 Dec 02 20:08:29 crc kubenswrapper[4807]: I1202 20:08:29.034210 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="22c2f7c6316bc285de3478f908e45f30636569ac437e807d0cfac9e66d5f44cd" exitCode=0 Dec 02 20:08:29 crc kubenswrapper[4807]: I1202 20:08:29.034276 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"22c2f7c6316bc285de3478f908e45f30636569ac437e807d0cfac9e66d5f44cd"} Dec 02 20:08:29 crc kubenswrapper[4807]: I1202 20:08:29.034800 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"3f19a7fc149c326bcfbb3a1b01b23ae680a3166497ae907c41fb2e7aadf72abf"} Dec 02 20:08:29 crc kubenswrapper[4807]: I1202 20:08:29.034832 4807 scope.go:117] "RemoveContainer" containerID="921e128bf7c751c48abace6e4b39de2d8b371ebc3478e06c957ad3086674020b" Dec 02 20:10:16 crc kubenswrapper[4807]: I1202 20:10:16.340777 4807 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 02 20:10:28 crc kubenswrapper[4807]: I1202 20:10:28.292819 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:10:28 crc kubenswrapper[4807]: I1202 20:10:28.293408 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:10:58 crc kubenswrapper[4807]: I1202 20:10:58.292757 4807 
patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:10:58 crc kubenswrapper[4807]: I1202 20:10:58.293381 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.016131 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wkdtr"] Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.017339 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wkdtr" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.024437 4807 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-dl4vp" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.024688 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.027071 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.033141 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-mx7pp"] Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.034154 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-mx7pp" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.034252 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zchnh\" (UniqueName: \"kubernetes.io/projected/2d1fe4d2-cc39-4c0c-a5e4-60366c119f94-kube-api-access-zchnh\") pod \"cert-manager-cainjector-7f985d654d-wkdtr\" (UID: \"2d1fe4d2-cc39-4c0c-a5e4-60366c119f94\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wkdtr" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.042049 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wkdtr"] Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.044193 4807 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bsbmz" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.050216 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-mx7pp"] Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.064710 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-fcc6m"] Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.065794 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-fcc6m" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.070060 4807 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-h6cst" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.077245 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-fcc6m"] Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.135659 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zchnh\" (UniqueName: \"kubernetes.io/projected/2d1fe4d2-cc39-4c0c-a5e4-60366c119f94-kube-api-access-zchnh\") pod \"cert-manager-cainjector-7f985d654d-wkdtr\" (UID: \"2d1fe4d2-cc39-4c0c-a5e4-60366c119f94\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wkdtr" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.159116 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zchnh\" (UniqueName: \"kubernetes.io/projected/2d1fe4d2-cc39-4c0c-a5e4-60366c119f94-kube-api-access-zchnh\") pod \"cert-manager-cainjector-7f985d654d-wkdtr\" (UID: \"2d1fe4d2-cc39-4c0c-a5e4-60366c119f94\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wkdtr" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.237408 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz2t4\" (UniqueName: \"kubernetes.io/projected/d47db1e8-80ed-44a6-9273-9d9fd2b05e33-kube-api-access-nz2t4\") pod \"cert-manager-5b446d88c5-mx7pp\" (UID: \"d47db1e8-80ed-44a6-9273-9d9fd2b05e33\") " pod="cert-manager/cert-manager-5b446d88c5-mx7pp" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.237507 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx5k2\" (UniqueName: 
\"kubernetes.io/projected/1e90575d-2771-427e-a759-824575491965-kube-api-access-wx5k2\") pod \"cert-manager-webhook-5655c58dd6-fcc6m\" (UID: \"1e90575d-2771-427e-a759-824575491965\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-fcc6m" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.338675 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wkdtr" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.339367 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz2t4\" (UniqueName: \"kubernetes.io/projected/d47db1e8-80ed-44a6-9273-9d9fd2b05e33-kube-api-access-nz2t4\") pod \"cert-manager-5b446d88c5-mx7pp\" (UID: \"d47db1e8-80ed-44a6-9273-9d9fd2b05e33\") " pod="cert-manager/cert-manager-5b446d88c5-mx7pp" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.339469 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx5k2\" (UniqueName: \"kubernetes.io/projected/1e90575d-2771-427e-a759-824575491965-kube-api-access-wx5k2\") pod \"cert-manager-webhook-5655c58dd6-fcc6m\" (UID: \"1e90575d-2771-427e-a759-824575491965\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-fcc6m" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.360243 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz2t4\" (UniqueName: \"kubernetes.io/projected/d47db1e8-80ed-44a6-9273-9d9fd2b05e33-kube-api-access-nz2t4\") pod \"cert-manager-5b446d88c5-mx7pp\" (UID: \"d47db1e8-80ed-44a6-9273-9d9fd2b05e33\") " pod="cert-manager/cert-manager-5b446d88c5-mx7pp" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.363080 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx5k2\" (UniqueName: \"kubernetes.io/projected/1e90575d-2771-427e-a759-824575491965-kube-api-access-wx5k2\") pod \"cert-manager-webhook-5655c58dd6-fcc6m\" (UID: 
\"1e90575d-2771-427e-a759-824575491965\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-fcc6m" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.383573 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-fcc6m" Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.605102 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wkdtr"] Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.618020 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.645995 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-fcc6m"] Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.649436 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-mx7pp" Dec 02 20:11:02 crc kubenswrapper[4807]: W1202 20:11:02.651582 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e90575d_2771_427e_a759_824575491965.slice/crio-81a78373160ea94edf6d09bfcfb2143b11afb510f225af91523302ce23bf368f WatchSource:0}: Error finding container 81a78373160ea94edf6d09bfcfb2143b11afb510f225af91523302ce23bf368f: Status 404 returned error can't find the container with id 81a78373160ea94edf6d09bfcfb2143b11afb510f225af91523302ce23bf368f Dec 02 20:11:02 crc kubenswrapper[4807]: I1202 20:11:02.859464 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-mx7pp"] Dec 02 20:11:02 crc kubenswrapper[4807]: W1202 20:11:02.867052 4807 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd47db1e8_80ed_44a6_9273_9d9fd2b05e33.slice/crio-f821d034fad5fb388b289e4a6896ea91d186edbc79ad56c01625e246e503a5b2 WatchSource:0}: Error finding container f821d034fad5fb388b289e4a6896ea91d186edbc79ad56c01625e246e503a5b2: Status 404 returned error can't find the container with id f821d034fad5fb388b289e4a6896ea91d186edbc79ad56c01625e246e503a5b2 Dec 02 20:11:03 crc kubenswrapper[4807]: I1202 20:11:03.037034 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-fcc6m" event={"ID":"1e90575d-2771-427e-a759-824575491965","Type":"ContainerStarted","Data":"81a78373160ea94edf6d09bfcfb2143b11afb510f225af91523302ce23bf368f"} Dec 02 20:11:03 crc kubenswrapper[4807]: I1202 20:11:03.038219 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wkdtr" event={"ID":"2d1fe4d2-cc39-4c0c-a5e4-60366c119f94","Type":"ContainerStarted","Data":"ebc9892a0f04aaa043550db5e43a4d36d4d8654bd92be550007505a300d03d2c"} Dec 02 20:11:03 crc kubenswrapper[4807]: I1202 20:11:03.039262 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-mx7pp" event={"ID":"d47db1e8-80ed-44a6-9273-9d9fd2b05e33","Type":"ContainerStarted","Data":"f821d034fad5fb388b289e4a6896ea91d186edbc79ad56c01625e246e503a5b2"} Dec 02 20:11:06 crc kubenswrapper[4807]: I1202 20:11:06.059078 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wkdtr" event={"ID":"2d1fe4d2-cc39-4c0c-a5e4-60366c119f94","Type":"ContainerStarted","Data":"c979df53942c73a762e768bfeddfd9bb64d702df8d2c2d2e8f3160f782ec9956"} Dec 02 20:11:06 crc kubenswrapper[4807]: I1202 20:11:06.076948 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-wkdtr" podStartSLOduration=1.6655144339999999 podStartE2EDuration="4.076927741s" 
podCreationTimestamp="2025-12-02 20:11:02 +0000 UTC" firstStartedPulling="2025-12-02 20:11:02.617691283 +0000 UTC m=+797.918598778" lastFinishedPulling="2025-12-02 20:11:05.02910459 +0000 UTC m=+800.330012085" observedRunningTime="2025-12-02 20:11:06.074288937 +0000 UTC m=+801.375196432" watchObservedRunningTime="2025-12-02 20:11:06.076927741 +0000 UTC m=+801.377835236" Dec 02 20:11:08 crc kubenswrapper[4807]: I1202 20:11:08.074432 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-mx7pp" event={"ID":"d47db1e8-80ed-44a6-9273-9d9fd2b05e33","Type":"ContainerStarted","Data":"df4bf833f34f29523970fd9466b8f391cb26147ffbc0a3d87e3ba19ca7825761"} Dec 02 20:11:08 crc kubenswrapper[4807]: I1202 20:11:08.077087 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-fcc6m" event={"ID":"1e90575d-2771-427e-a759-824575491965","Type":"ContainerStarted","Data":"29c09aa3f22919427c893d79f5bcd7d93bcefffe8be6af454a9bb5ee7b182fe3"} Dec 02 20:11:08 crc kubenswrapper[4807]: I1202 20:11:08.077273 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-fcc6m" Dec 02 20:11:08 crc kubenswrapper[4807]: I1202 20:11:08.111500 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-mx7pp" podStartSLOduration=1.717104207 podStartE2EDuration="6.111482427s" podCreationTimestamp="2025-12-02 20:11:02 +0000 UTC" firstStartedPulling="2025-12-02 20:11:02.870264915 +0000 UTC m=+798.171172410" lastFinishedPulling="2025-12-02 20:11:07.264643115 +0000 UTC m=+802.565550630" observedRunningTime="2025-12-02 20:11:08.090959215 +0000 UTC m=+803.391866730" watchObservedRunningTime="2025-12-02 20:11:08.111482427 +0000 UTC m=+803.412389922" Dec 02 20:11:08 crc kubenswrapper[4807]: I1202 20:11:08.111816 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-fcc6m" podStartSLOduration=1.7976973790000002 podStartE2EDuration="6.111810766s" podCreationTimestamp="2025-12-02 20:11:02 +0000 UTC" firstStartedPulling="2025-12-02 20:11:02.666543306 +0000 UTC m=+797.967450801" lastFinishedPulling="2025-12-02 20:11:06.980656693 +0000 UTC m=+802.281564188" observedRunningTime="2025-12-02 20:11:08.109260944 +0000 UTC m=+803.410168449" watchObservedRunningTime="2025-12-02 20:11:08.111810766 +0000 UTC m=+803.412718261" Dec 02 20:11:11 crc kubenswrapper[4807]: I1202 20:11:11.976200 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5plsn"] Dec 02 20:11:11 crc kubenswrapper[4807]: I1202 20:11:11.977528 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovn-controller" containerID="cri-o://07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def" gracePeriod=30 Dec 02 20:11:11 crc kubenswrapper[4807]: I1202 20:11:11.977599 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="nbdb" containerID="cri-o://8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a" gracePeriod=30 Dec 02 20:11:11 crc kubenswrapper[4807]: I1202 20:11:11.977796 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="northd" containerID="cri-o://44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a" gracePeriod=30 Dec 02 20:11:11 crc kubenswrapper[4807]: I1202 20:11:11.977919 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" 
containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db" gracePeriod=30 Dec 02 20:11:11 crc kubenswrapper[4807]: I1202 20:11:11.978012 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="kube-rbac-proxy-node" containerID="cri-o://1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5" gracePeriod=30 Dec 02 20:11:11 crc kubenswrapper[4807]: I1202 20:11:11.978111 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovn-acl-logging" containerID="cri-o://7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5" gracePeriod=30 Dec 02 20:11:11 crc kubenswrapper[4807]: I1202 20:11:11.978192 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="sbdb" containerID="cri-o://60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e" gracePeriod=30 Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.029367 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" containerID="cri-o://650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2" gracePeriod=30 Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.107980 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/3.log" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.109385 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovn-acl-logging/0.log" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.109852 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db" exitCode=0 Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.109919 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db"} Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.111187 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f5x8r_8a909a25-5ede-458e-af78-4a41b79716a5/kube-multus/2.log" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.112296 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f5x8r_8a909a25-5ede-458e-af78-4a41b79716a5/kube-multus/1.log" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.112315 4807 generic.go:334] "Generic (PLEG): container finished" podID="8a909a25-5ede-458e-af78-4a41b79716a5" containerID="36732a036554b4a2b688a0b85aa52399128710e2139572d5ad3f190bb95b4a72" exitCode=2 Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.112334 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f5x8r" event={"ID":"8a909a25-5ede-458e-af78-4a41b79716a5","Type":"ContainerDied","Data":"36732a036554b4a2b688a0b85aa52399128710e2139572d5ad3f190bb95b4a72"} Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.112357 4807 scope.go:117] "RemoveContainer" containerID="c3d0b1c29f6c59ee7f954123d78e57d3bc99b5d9618ebbf455d6b4cb71a08d63" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.113036 4807 scope.go:117] "RemoveContainer" 
containerID="36732a036554b4a2b688a0b85aa52399128710e2139572d5ad3f190bb95b4a72" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.361506 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/3.log" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.364362 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovn-acl-logging/0.log" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.364903 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovn-controller/0.log" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.365321 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.387296 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-fcc6m" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434062 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-825dk"] Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434401 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovn-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434420 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovn-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434434 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434442 4807 
state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434450 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434456 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434463 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="kube-rbac-proxy-node" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434470 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="kube-rbac-proxy-node" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434479 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="sbdb" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434486 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="sbdb" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434496 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovn-acl-logging" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434505 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovn-acl-logging" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434517 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="northd" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434527 4807 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="northd" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434541 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="kubecfg-setup" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434551 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="kubecfg-setup" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434559 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434567 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434581 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="nbdb" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434589 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="nbdb" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434597 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434604 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434755 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="northd" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434768 4807 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovn-acl-logging" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434781 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="kube-rbac-proxy-node" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434791 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434805 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="sbdb" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434819 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434831 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434840 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovn-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434848 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434857 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="nbdb" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434869 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.434983 4807 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.434993 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: E1202 20:11:12.435002 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.435010 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.435125 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" containerName="ovnkube-controller" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.439101 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.482142 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdj78\" (UniqueName: \"kubernetes.io/projected/798a6158-a963-43b4-941e-ac4f3df2f883-kube-api-access-tdj78\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.482754 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-slash\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.482826 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-ovn\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.482849 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-script-lib\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.482881 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-env-overrides\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.482896 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-etc-openvswitch\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.482911 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-netns\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.482928 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-var-lib-openvswitch\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.482942 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-systemd-units\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.482963 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-netd\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.482992 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-bin\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 
crc kubenswrapper[4807]: I1202 20:11:12.483030 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/798a6158-a963-43b4-941e-ac4f3df2f883-ovn-node-metrics-cert\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483066 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-log-socket\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483085 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-openvswitch\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483105 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-var-lib-cni-networks-ovn-kubernetes\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483132 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-node-log\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483150 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-kubelet\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483183 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-systemd\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483209 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-ovn-kubernetes\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483226 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-config\") pod \"798a6158-a963-43b4-941e-ac4f3df2f883\" (UID: \"798a6158-a963-43b4-941e-ac4f3df2f883\") " Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483287 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483380 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6a4dd56-a9ea-4c13-91ba-25869958233a-ovnkube-script-lib\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483423 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483446 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483463 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483480 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483497 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483512 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483510 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483582 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-node-log" (OuterVolumeSpecName: "node-log") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483612 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-run-openvswitch\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483638 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-log-socket" (OuterVolumeSpecName: "log-socket") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483647 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-log-socket\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483675 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483680 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-cni-netd\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483780 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483833 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-run-systemd\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483871 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-var-lib-openvswitch\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483920 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6a4dd56-a9ea-4c13-91ba-25869958233a-env-overrides\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.483962 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-kubelet\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484003 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-run-ovn\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484029 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484135 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484134 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484166 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484344 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484630 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-slash" (OuterVolumeSpecName: "host-slash") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484675 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-node-log\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484777 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-slash\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484873 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-systemd-units\") pod 
\"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484911 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-cni-bin\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484961 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbq5t\" (UniqueName: \"kubernetes.io/projected/d6a4dd56-a9ea-4c13-91ba-25869958233a-kube-api-access-dbq5t\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.484988 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6a4dd56-a9ea-4c13-91ba-25869958233a-ovn-node-metrics-cert\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485038 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-run-ovn-kubernetes\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485177 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d6a4dd56-a9ea-4c13-91ba-25869958233a-ovnkube-config\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485261 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-etc-openvswitch\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485341 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-run-netns\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485535 4807 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485563 4807 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-log-socket\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485578 4807 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485596 4807 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485611 4807 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-node-log\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485624 4807 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485641 4807 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485656 4807 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485668 4807 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-slash\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485680 4807 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485692 4807 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-ovnkube-script-lib\") on node \"crc\" DevicePath 
\"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485707 4807 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/798a6158-a963-43b4-941e-ac4f3df2f883-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485736 4807 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485747 4807 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485759 4807 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485770 4807 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.485781 4807 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.488025 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798a6158-a963-43b4-941e-ac4f3df2f883-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). 
InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.488239 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798a6158-a963-43b4-941e-ac4f3df2f883-kube-api-access-tdj78" (OuterVolumeSpecName: "kube-api-access-tdj78") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "kube-api-access-tdj78". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.496192 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "798a6158-a963-43b4-941e-ac4f3df2f883" (UID: "798a6158-a963-43b4-941e-ac4f3df2f883"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587062 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6a4dd56-a9ea-4c13-91ba-25869958233a-ovnkube-config\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587129 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-etc-openvswitch\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587152 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-run-netns\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587176 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6a4dd56-a9ea-4c13-91ba-25869958233a-ovnkube-script-lib\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587208 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-run-openvswitch\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587227 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-log-socket\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587230 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-etc-openvswitch\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587268 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-run-netns\") pod 
\"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587304 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-log-socket\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587320 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-run-openvswitch\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587277 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-cni-netd\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587250 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-cni-netd\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587400 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-825dk\" (UID: 
\"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587437 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-run-systemd\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587447 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587463 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-var-lib-openvswitch\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587493 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-run-systemd\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587508 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6a4dd56-a9ea-4c13-91ba-25869958233a-env-overrides\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587521 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-var-lib-openvswitch\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587540 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-kubelet\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587574 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-run-ovn\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587654 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-node-log\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587683 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-slash\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587781 
4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-systemd-units\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587805 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-cni-bin\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587835 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbq5t\" (UniqueName: \"kubernetes.io/projected/d6a4dd56-a9ea-4c13-91ba-25869958233a-kube-api-access-dbq5t\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587864 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6a4dd56-a9ea-4c13-91ba-25869958233a-ovn-node-metrics-cert\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.587918 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-run-ovn-kubernetes\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588008 4807 reconciler_common.go:293] "Volume 
detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/798a6158-a963-43b4-941e-ac4f3df2f883-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588032 4807 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/798a6158-a963-43b4-941e-ac4f3df2f883-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588044 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdj78\" (UniqueName: \"kubernetes.io/projected/798a6158-a963-43b4-941e-ac4f3df2f883-kube-api-access-tdj78\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588076 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d6a4dd56-a9ea-4c13-91ba-25869958233a-ovnkube-config\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588083 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-run-ovn-kubernetes\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588107 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-systemd-units\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588119 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d6a4dd56-a9ea-4c13-91ba-25869958233a-env-overrides\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588132 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-cni-bin\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588149 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-run-ovn\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588184 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-kubelet\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588215 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-node-log\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588296 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d6a4dd56-a9ea-4c13-91ba-25869958233a-ovnkube-script-lib\") pod \"ovnkube-node-825dk\" 
(UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.588392 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6a4dd56-a9ea-4c13-91ba-25869958233a-host-slash\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.591791 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d6a4dd56-a9ea-4c13-91ba-25869958233a-ovn-node-metrics-cert\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.603921 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbq5t\" (UniqueName: \"kubernetes.io/projected/d6a4dd56-a9ea-4c13-91ba-25869958233a-kube-api-access-dbq5t\") pod \"ovnkube-node-825dk\" (UID: \"d6a4dd56-a9ea-4c13-91ba-25869958233a\") " pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: I1202 20:11:12.755129 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:12 crc kubenswrapper[4807]: W1202 20:11:12.776398 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6a4dd56_a9ea_4c13_91ba_25869958233a.slice/crio-32e70dbd8bef941a95c88f8f4640fe85397b4c4d7ac2fb90b4a89b1fbe17cb07 WatchSource:0}: Error finding container 32e70dbd8bef941a95c88f8f4640fe85397b4c4d7ac2fb90b4a89b1fbe17cb07: Status 404 returned error can't find the container with id 32e70dbd8bef941a95c88f8f4640fe85397b4c4d7ac2fb90b4a89b1fbe17cb07 Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.124521 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f5x8r_8a909a25-5ede-458e-af78-4a41b79716a5/kube-multus/2.log" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.125347 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f5x8r" event={"ID":"8a909a25-5ede-458e-af78-4a41b79716a5","Type":"ContainerStarted","Data":"781c8b39891f96e4e9fcdb5336ddfef68704b82ac8cc4a6144bc2b8d54a93bf6"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.128185 4807 generic.go:334] "Generic (PLEG): container finished" podID="d6a4dd56-a9ea-4c13-91ba-25869958233a" containerID="8625ae299fd90cf095a447f71f73e3b375042166b91570b214bad9e0f98a129e" exitCode=0 Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.128311 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" event={"ID":"d6a4dd56-a9ea-4c13-91ba-25869958233a","Type":"ContainerDied","Data":"8625ae299fd90cf095a447f71f73e3b375042166b91570b214bad9e0f98a129e"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.128360 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" 
event={"ID":"d6a4dd56-a9ea-4c13-91ba-25869958233a","Type":"ContainerStarted","Data":"32e70dbd8bef941a95c88f8f4640fe85397b4c4d7ac2fb90b4a89b1fbe17cb07"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.136098 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovnkube-controller/3.log" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.139658 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovn-acl-logging/0.log" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.144361 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5plsn_798a6158-a963-43b4-941e-ac4f3df2f883/ovn-controller/0.log" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145244 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2" exitCode=0 Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145295 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e" exitCode=0 Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145311 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a" exitCode=0 Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145330 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a" exitCode=0 Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145344 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5" exitCode=0 Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145357 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5" exitCode=143 Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145371 4807 generic.go:334] "Generic (PLEG): container finished" podID="798a6158-a963-43b4-941e-ac4f3df2f883" containerID="07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def" exitCode=143 Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145407 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145453 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145476 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145495 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145513 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145530 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145552 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145573 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145583 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145592 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145602 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145612 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145621 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145631 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145640 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145653 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145668 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145682 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145691 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145701 4807 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145709 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145809 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145821 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145830 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145839 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145860 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145874 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" event={"ID":"798a6158-a963-43b4-941e-ac4f3df2f883","Type":"ContainerDied","Data":"35727bb2a991ed917b170c54a85ae1b0b9ecd4d65856f8406ea608b4ce53f23f"} Dec 02 
20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145889 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145901 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145911 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145920 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145930 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145939 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145948 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145960 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5"} Dec 02 
20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145968 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.145978 4807 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849"} Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.146001 4807 scope.go:117] "RemoveContainer" containerID="650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.146032 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5plsn" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.168471 4807 scope.go:117] "RemoveContainer" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.202850 4807 scope.go:117] "RemoveContainer" containerID="60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.218015 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5plsn"] Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.228572 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5plsn"] Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.250616 4807 scope.go:117] "RemoveContainer" containerID="8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.268757 4807 scope.go:117] "RemoveContainer" containerID="44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.287359 4807 
scope.go:117] "RemoveContainer" containerID="bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.307408 4807 scope.go:117] "RemoveContainer" containerID="1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.328364 4807 scope.go:117] "RemoveContainer" containerID="7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.362970 4807 scope.go:117] "RemoveContainer" containerID="07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.381963 4807 scope.go:117] "RemoveContainer" containerID="49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.401072 4807 scope.go:117] "RemoveContainer" containerID="650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2" Dec 02 20:11:13 crc kubenswrapper[4807]: E1202 20:11:13.401520 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2\": container with ID starting with 650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2 not found: ID does not exist" containerID="650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.401569 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2"} err="failed to get container status \"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2\": rpc error: code = NotFound desc = could not find container \"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2\": container with ID starting with 
650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.401603 4807 scope.go:117] "RemoveContainer" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 20:11:13 crc kubenswrapper[4807]: E1202 20:11:13.401938 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\": container with ID starting with 4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c not found: ID does not exist" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.401998 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c"} err="failed to get container status \"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\": rpc error: code = NotFound desc = could not find container \"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\": container with ID starting with 4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.402037 4807 scope.go:117] "RemoveContainer" containerID="60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e" Dec 02 20:11:13 crc kubenswrapper[4807]: E1202 20:11:13.402372 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\": container with ID starting with 60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e not found: ID does not exist" containerID="60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e" Dec 02 20:11:13 crc 
kubenswrapper[4807]: I1202 20:11:13.402402 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e"} err="failed to get container status \"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\": rpc error: code = NotFound desc = could not find container \"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\": container with ID starting with 60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.402427 4807 scope.go:117] "RemoveContainer" containerID="8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a" Dec 02 20:11:13 crc kubenswrapper[4807]: E1202 20:11:13.402963 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\": container with ID starting with 8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a not found: ID does not exist" containerID="8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.402993 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a"} err="failed to get container status \"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\": rpc error: code = NotFound desc = could not find container \"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\": container with ID starting with 8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.403010 4807 scope.go:117] "RemoveContainer" containerID="44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a" Dec 02 
20:11:13 crc kubenswrapper[4807]: E1202 20:11:13.403426 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\": container with ID starting with 44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a not found: ID does not exist" containerID="44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.403471 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a"} err="failed to get container status \"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\": rpc error: code = NotFound desc = could not find container \"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\": container with ID starting with 44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.403493 4807 scope.go:117] "RemoveContainer" containerID="bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db" Dec 02 20:11:13 crc kubenswrapper[4807]: E1202 20:11:13.403844 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\": container with ID starting with bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db not found: ID does not exist" containerID="bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.403874 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db"} err="failed to get container status 
\"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\": rpc error: code = NotFound desc = could not find container \"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\": container with ID starting with bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.403895 4807 scope.go:117] "RemoveContainer" containerID="1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5" Dec 02 20:11:13 crc kubenswrapper[4807]: E1202 20:11:13.404250 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\": container with ID starting with 1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5 not found: ID does not exist" containerID="1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.404491 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5"} err="failed to get container status \"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\": rpc error: code = NotFound desc = could not find container \"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\": container with ID starting with 1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.404508 4807 scope.go:117] "RemoveContainer" containerID="7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5" Dec 02 20:11:13 crc kubenswrapper[4807]: E1202 20:11:13.404868 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\": container with ID starting with 7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5 not found: ID does not exist" containerID="7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.404911 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5"} err="failed to get container status \"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\": rpc error: code = NotFound desc = could not find container \"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\": container with ID starting with 7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.404929 4807 scope.go:117] "RemoveContainer" containerID="07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def" Dec 02 20:11:13 crc kubenswrapper[4807]: E1202 20:11:13.405235 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\": container with ID starting with 07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def not found: ID does not exist" containerID="07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.405458 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def"} err="failed to get container status \"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\": rpc error: code = NotFound desc = could not find container \"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\": container with ID 
starting with 07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.405474 4807 scope.go:117] "RemoveContainer" containerID="49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849" Dec 02 20:11:13 crc kubenswrapper[4807]: E1202 20:11:13.405878 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\": container with ID starting with 49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849 not found: ID does not exist" containerID="49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.405901 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849"} err="failed to get container status \"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\": rpc error: code = NotFound desc = could not find container \"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\": container with ID starting with 49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.405917 4807 scope.go:117] "RemoveContainer" containerID="650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.406225 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2"} err="failed to get container status \"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2\": rpc error: code = NotFound desc = could not find container \"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2\": 
container with ID starting with 650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.406265 4807 scope.go:117] "RemoveContainer" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.406545 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c"} err="failed to get container status \"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\": rpc error: code = NotFound desc = could not find container \"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\": container with ID starting with 4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.406564 4807 scope.go:117] "RemoveContainer" containerID="60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.407011 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e"} err="failed to get container status \"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\": rpc error: code = NotFound desc = could not find container \"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\": container with ID starting with 60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.407032 4807 scope.go:117] "RemoveContainer" containerID="8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.407228 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a"} err="failed to get container status \"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\": rpc error: code = NotFound desc = could not find container \"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\": container with ID starting with 8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.407247 4807 scope.go:117] "RemoveContainer" containerID="44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.407454 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a"} err="failed to get container status \"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\": rpc error: code = NotFound desc = could not find container \"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\": container with ID starting with 44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.407477 4807 scope.go:117] "RemoveContainer" containerID="bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.407770 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db"} err="failed to get container status \"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\": rpc error: code = NotFound desc = could not find container \"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\": container with ID starting with bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db not found: ID does not 
exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.407793 4807 scope.go:117] "RemoveContainer" containerID="1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.408224 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5"} err="failed to get container status \"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\": rpc error: code = NotFound desc = could not find container \"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\": container with ID starting with 1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.408283 4807 scope.go:117] "RemoveContainer" containerID="7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.408782 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5"} err="failed to get container status \"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\": rpc error: code = NotFound desc = could not find container \"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\": container with ID starting with 7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.408854 4807 scope.go:117] "RemoveContainer" containerID="07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.409272 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def"} err="failed to get container status 
\"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\": rpc error: code = NotFound desc = could not find container \"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\": container with ID starting with 07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.409299 4807 scope.go:117] "RemoveContainer" containerID="49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.410167 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849"} err="failed to get container status \"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\": rpc error: code = NotFound desc = could not find container \"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\": container with ID starting with 49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.410193 4807 scope.go:117] "RemoveContainer" containerID="650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.410707 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2"} err="failed to get container status \"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2\": rpc error: code = NotFound desc = could not find container \"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2\": container with ID starting with 650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.410744 4807 scope.go:117] "RemoveContainer" 
containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.410971 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c"} err="failed to get container status \"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\": rpc error: code = NotFound desc = could not find container \"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\": container with ID starting with 4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.410990 4807 scope.go:117] "RemoveContainer" containerID="60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.411214 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e"} err="failed to get container status \"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\": rpc error: code = NotFound desc = could not find container \"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\": container with ID starting with 60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.411246 4807 scope.go:117] "RemoveContainer" containerID="8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.411698 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a"} err="failed to get container status \"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\": rpc error: code = NotFound desc = could 
not find container \"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\": container with ID starting with 8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.411992 4807 scope.go:117] "RemoveContainer" containerID="44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.412229 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a"} err="failed to get container status \"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\": rpc error: code = NotFound desc = could not find container \"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\": container with ID starting with 44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.412254 4807 scope.go:117] "RemoveContainer" containerID="bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.412474 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db"} err="failed to get container status \"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\": rpc error: code = NotFound desc = could not find container \"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\": container with ID starting with bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.412501 4807 scope.go:117] "RemoveContainer" containerID="1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 
20:11:13.412773 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5"} err="failed to get container status \"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\": rpc error: code = NotFound desc = could not find container \"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\": container with ID starting with 1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.412793 4807 scope.go:117] "RemoveContainer" containerID="7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.413005 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5"} err="failed to get container status \"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\": rpc error: code = NotFound desc = could not find container \"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\": container with ID starting with 7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.413023 4807 scope.go:117] "RemoveContainer" containerID="07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.413205 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def"} err="failed to get container status \"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\": rpc error: code = NotFound desc = could not find container \"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\": container with ID starting with 
07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.413222 4807 scope.go:117] "RemoveContainer" containerID="49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.413441 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849"} err="failed to get container status \"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\": rpc error: code = NotFound desc = could not find container \"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\": container with ID starting with 49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.413462 4807 scope.go:117] "RemoveContainer" containerID="650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.413906 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2"} err="failed to get container status \"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2\": rpc error: code = NotFound desc = could not find container \"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2\": container with ID starting with 650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.413960 4807 scope.go:117] "RemoveContainer" containerID="4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.414546 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c"} err="failed to get container status \"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\": rpc error: code = NotFound desc = could not find container \"4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c\": container with ID starting with 4099dfdb702ae92e9ceb8906e9fb046222f1cb932254b72a7d3127167cb22d7c not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.414570 4807 scope.go:117] "RemoveContainer" containerID="60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.414864 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e"} err="failed to get container status \"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\": rpc error: code = NotFound desc = could not find container \"60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e\": container with ID starting with 60e029c54dbeb1281ca24daa2c2f8bab24c4d0a79c3629fc2cab9349c250756e not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.414883 4807 scope.go:117] "RemoveContainer" containerID="8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.415130 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a"} err="failed to get container status \"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\": rpc error: code = NotFound desc = could not find container \"8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a\": container with ID starting with 8c599a921470f2a334d72570d09116d697ea46ebcb4d6784ea0c6dc66654b90a not found: ID does not 
exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.415153 4807 scope.go:117] "RemoveContainer" containerID="44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.415390 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a"} err="failed to get container status \"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\": rpc error: code = NotFound desc = could not find container \"44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a\": container with ID starting with 44be336079476c9ecfa14f0c563ef991dabe1c9d4b439d0a2540bca875840d1a not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.415413 4807 scope.go:117] "RemoveContainer" containerID="bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.415741 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db"} err="failed to get container status \"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\": rpc error: code = NotFound desc = could not find container \"bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db\": container with ID starting with bb32bf90d9ca162888ff4de1bf0fb54a9f6b9d2764c45a465b2da6a6467bf5db not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.415767 4807 scope.go:117] "RemoveContainer" containerID="1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.415984 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5"} err="failed to get container status 
\"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\": rpc error: code = NotFound desc = could not find container \"1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5\": container with ID starting with 1bd3a9d694c094b034c4a8d2b457fa01c4070553dab093e316a2d5198c30f7b5 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.416007 4807 scope.go:117] "RemoveContainer" containerID="7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.416370 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5"} err="failed to get container status \"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\": rpc error: code = NotFound desc = could not find container \"7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5\": container with ID starting with 7dbe66443e514a50ccc1615c8c404cd77bb6992181c687de8c093ab7804298f5 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.416392 4807 scope.go:117] "RemoveContainer" containerID="07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.416755 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def"} err="failed to get container status \"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\": rpc error: code = NotFound desc = could not find container \"07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def\": container with ID starting with 07332a93a5f2e147ce5884cef2e4e13ac25c6b4f4845aae459ba496700937def not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.416786 4807 scope.go:117] "RemoveContainer" 
containerID="49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.417090 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849"} err="failed to get container status \"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\": rpc error: code = NotFound desc = could not find container \"49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849\": container with ID starting with 49c817a65265c9d4e98f7cec7fc5914c0ff7cb55cf7da1456e02dd5348ab1849 not found: ID does not exist" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.417139 4807 scope.go:117] "RemoveContainer" containerID="650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2" Dec 02 20:11:13 crc kubenswrapper[4807]: I1202 20:11:13.417619 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2"} err="failed to get container status \"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2\": rpc error: code = NotFound desc = could not find container \"650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2\": container with ID starting with 650b19a0a58f6234963342d7c64c87d29b052c772826843f138266798baba0b2 not found: ID does not exist" Dec 02 20:11:14 crc kubenswrapper[4807]: I1202 20:11:14.156800 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" event={"ID":"d6a4dd56-a9ea-4c13-91ba-25869958233a","Type":"ContainerStarted","Data":"2f3e209a463836bff0923a3b569d9789e4d758d48edf1c3e7d3734b9b01b4435"} Dec 02 20:11:14 crc kubenswrapper[4807]: I1202 20:11:14.157146 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" 
event={"ID":"d6a4dd56-a9ea-4c13-91ba-25869958233a","Type":"ContainerStarted","Data":"6ca7375e4bc4775244409ca7dad80ce7a1e00bb3873fcaedeeb4947ffad0d685"} Dec 02 20:11:14 crc kubenswrapper[4807]: I1202 20:11:14.157166 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" event={"ID":"d6a4dd56-a9ea-4c13-91ba-25869958233a","Type":"ContainerStarted","Data":"e11bfc50a65dc3bfb3c64003da1dd49d532266bc15a064e09ff5fd4597bc8085"} Dec 02 20:11:14 crc kubenswrapper[4807]: I1202 20:11:14.157181 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" event={"ID":"d6a4dd56-a9ea-4c13-91ba-25869958233a","Type":"ContainerStarted","Data":"6f39a9f7537085d8f5a7cb63dd98e45da6e0b362eb133cbb3254bffeab5f82d5"} Dec 02 20:11:14 crc kubenswrapper[4807]: I1202 20:11:14.157194 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" event={"ID":"d6a4dd56-a9ea-4c13-91ba-25869958233a","Type":"ContainerStarted","Data":"fa880011a1b49eba595c6faa12a2cde2952b4ed222572b662906393db9e8a3e4"} Dec 02 20:11:14 crc kubenswrapper[4807]: I1202 20:11:14.157206 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" event={"ID":"d6a4dd56-a9ea-4c13-91ba-25869958233a","Type":"ContainerStarted","Data":"33175be62297f8cba80c209d8146e3d29a0a544eaf91d87d04dce04d1d127903"} Dec 02 20:11:14 crc kubenswrapper[4807]: I1202 20:11:14.982418 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798a6158-a963-43b4-941e-ac4f3df2f883" path="/var/lib/kubelet/pods/798a6158-a963-43b4-941e-ac4f3df2f883/volumes" Dec 02 20:11:17 crc kubenswrapper[4807]: I1202 20:11:17.180176 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" 
event={"ID":"d6a4dd56-a9ea-4c13-91ba-25869958233a","Type":"ContainerStarted","Data":"29046317a6619374a7581c28c352a1eb9bb60a004081d6ca5c08dd84aa74832c"} Dec 02 20:11:19 crc kubenswrapper[4807]: I1202 20:11:19.199732 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" event={"ID":"d6a4dd56-a9ea-4c13-91ba-25869958233a","Type":"ContainerStarted","Data":"f341a8cdf886e5d867382a943bcda0d1fc5f1fe42dbe23fbe47fb54dd992805a"} Dec 02 20:11:19 crc kubenswrapper[4807]: I1202 20:11:19.200275 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:19 crc kubenswrapper[4807]: I1202 20:11:19.200290 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:19 crc kubenswrapper[4807]: I1202 20:11:19.200301 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:19 crc kubenswrapper[4807]: I1202 20:11:19.230346 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" podStartSLOduration=7.230326049 podStartE2EDuration="7.230326049s" podCreationTimestamp="2025-12-02 20:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:11:19.228756964 +0000 UTC m=+814.529664459" watchObservedRunningTime="2025-12-02 20:11:19.230326049 +0000 UTC m=+814.531233544" Dec 02 20:11:19 crc kubenswrapper[4807]: I1202 20:11:19.234375 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:19 crc kubenswrapper[4807]: I1202 20:11:19.235173 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:28 crc 
kubenswrapper[4807]: I1202 20:11:28.293431 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:11:28 crc kubenswrapper[4807]: I1202 20:11:28.294355 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:11:28 crc kubenswrapper[4807]: I1202 20:11:28.294434 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:11:28 crc kubenswrapper[4807]: I1202 20:11:28.295450 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f19a7fc149c326bcfbb3a1b01b23ae680a3166497ae907c41fb2e7aadf72abf"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:11:28 crc kubenswrapper[4807]: I1202 20:11:28.295567 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://3f19a7fc149c326bcfbb3a1b01b23ae680a3166497ae907c41fb2e7aadf72abf" gracePeriod=600 Dec 02 20:11:29 crc kubenswrapper[4807]: I1202 20:11:29.270649 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" 
event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"3f19a7fc149c326bcfbb3a1b01b23ae680a3166497ae907c41fb2e7aadf72abf"} Dec 02 20:11:29 crc kubenswrapper[4807]: I1202 20:11:29.272029 4807 scope.go:117] "RemoveContainer" containerID="22c2f7c6316bc285de3478f908e45f30636569ac437e807d0cfac9e66d5f44cd" Dec 02 20:11:29 crc kubenswrapper[4807]: I1202 20:11:29.270783 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="3f19a7fc149c326bcfbb3a1b01b23ae680a3166497ae907c41fb2e7aadf72abf" exitCode=0 Dec 02 20:11:29 crc kubenswrapper[4807]: I1202 20:11:29.272339 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"e7e961064ac2bf5a3d6cced9f51594218b6b12f9bc4f97979713bb99fd8aad69"} Dec 02 20:11:42 crc kubenswrapper[4807]: I1202 20:11:42.780295 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-825dk" Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.633847 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc"] Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.636227 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.640143 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.649815 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc"] Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.827209 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmk56\" (UniqueName: \"kubernetes.io/projected/25348ba1-760f-46a3-9f25-4054fb9ebed4-kube-api-access-nmk56\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.827278 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.827380 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:43 crc kubenswrapper[4807]: 
I1202 20:11:43.928108 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmk56\" (UniqueName: \"kubernetes.io/projected/25348ba1-760f-46a3-9f25-4054fb9ebed4-kube-api-access-nmk56\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.928175 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.928233 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.929102 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.929321 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.963228 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmk56\" (UniqueName: \"kubernetes.io/projected/25348ba1-760f-46a3-9f25-4054fb9ebed4-kube-api-access-nmk56\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:43 crc kubenswrapper[4807]: I1202 20:11:43.963531 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:44 crc kubenswrapper[4807]: I1202 20:11:44.173233 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc"] Dec 02 20:11:44 crc kubenswrapper[4807]: W1202 20:11:44.182008 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-c5a162fb2d477bd804227acd05a8db772c79dc271f17259d0a419726ecb66b9c WatchSource:0}: Error finding container c5a162fb2d477bd804227acd05a8db772c79dc271f17259d0a419726ecb66b9c: Status 404 returned error can't find the container with id c5a162fb2d477bd804227acd05a8db772c79dc271f17259d0a419726ecb66b9c Dec 02 20:11:44 crc kubenswrapper[4807]: I1202 20:11:44.371201 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" 
event={"ID":"25348ba1-760f-46a3-9f25-4054fb9ebed4","Type":"ContainerStarted","Data":"c5a162fb2d477bd804227acd05a8db772c79dc271f17259d0a419726ecb66b9c"} Dec 02 20:11:45 crc kubenswrapper[4807]: I1202 20:11:45.381002 4807 generic.go:334] "Generic (PLEG): container finished" podID="25348ba1-760f-46a3-9f25-4054fb9ebed4" containerID="623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2" exitCode=0 Dec 02 20:11:45 crc kubenswrapper[4807]: I1202 20:11:45.381085 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" event={"ID":"25348ba1-760f-46a3-9f25-4054fb9ebed4","Type":"ContainerDied","Data":"623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2"} Dec 02 20:11:45 crc kubenswrapper[4807]: E1202 20:11:45.720913 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-conmon-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:11:45 crc kubenswrapper[4807]: I1202 20:11:45.954547 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7fbl6"] Dec 02 20:11:45 crc kubenswrapper[4807]: I1202 20:11:45.955940 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:11:45 crc kubenswrapper[4807]: I1202 20:11:45.956587 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-catalog-content\") pod \"redhat-operators-7fbl6\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:11:45 crc kubenswrapper[4807]: I1202 20:11:45.956734 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-utilities\") pod \"redhat-operators-7fbl6\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:11:45 crc kubenswrapper[4807]: I1202 20:11:45.956833 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s4rt\" (UniqueName: \"kubernetes.io/projected/fa686bc1-b30b-4041-952e-cc0280094d85-kube-api-access-5s4rt\") pod \"redhat-operators-7fbl6\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:11:45 crc kubenswrapper[4807]: I1202 20:11:45.975661 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fbl6"] Dec 02 20:11:46 crc kubenswrapper[4807]: I1202 20:11:46.057964 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s4rt\" (UniqueName: \"kubernetes.io/projected/fa686bc1-b30b-4041-952e-cc0280094d85-kube-api-access-5s4rt\") pod \"redhat-operators-7fbl6\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:11:46 crc kubenswrapper[4807]: I1202 20:11:46.058099 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-catalog-content\") pod \"redhat-operators-7fbl6\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:11:46 crc kubenswrapper[4807]: I1202 20:11:46.058132 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-utilities\") pod \"redhat-operators-7fbl6\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:11:46 crc kubenswrapper[4807]: I1202 20:11:46.058666 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-utilities\") pod \"redhat-operators-7fbl6\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:11:46 crc kubenswrapper[4807]: I1202 20:11:46.058804 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-catalog-content\") pod \"redhat-operators-7fbl6\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:11:46 crc kubenswrapper[4807]: I1202 20:11:46.085580 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s4rt\" (UniqueName: \"kubernetes.io/projected/fa686bc1-b30b-4041-952e-cc0280094d85-kube-api-access-5s4rt\") pod \"redhat-operators-7fbl6\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:11:46 crc kubenswrapper[4807]: I1202 20:11:46.278050 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:11:46 crc kubenswrapper[4807]: I1202 20:11:46.498876 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7fbl6"] Dec 02 20:11:47 crc kubenswrapper[4807]: I1202 20:11:47.400662 4807 generic.go:334] "Generic (PLEG): container finished" podID="fa686bc1-b30b-4041-952e-cc0280094d85" containerID="6942dd8686b085a959ff0d6732b430281a2998c2e64c27a823a07a09aa9bc167" exitCode=0 Dec 02 20:11:47 crc kubenswrapper[4807]: I1202 20:11:47.400951 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbl6" event={"ID":"fa686bc1-b30b-4041-952e-cc0280094d85","Type":"ContainerDied","Data":"6942dd8686b085a959ff0d6732b430281a2998c2e64c27a823a07a09aa9bc167"} Dec 02 20:11:47 crc kubenswrapper[4807]: I1202 20:11:47.401094 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbl6" event={"ID":"fa686bc1-b30b-4041-952e-cc0280094d85","Type":"ContainerStarted","Data":"1db0f4b90d5f651da6b5a73a49fc289bc1e25ca9a6cb6a86c4ab1e2348a60f78"} Dec 02 20:11:53 crc kubenswrapper[4807]: I1202 20:11:53.436755 4807 generic.go:334] "Generic (PLEG): container finished" podID="25348ba1-760f-46a3-9f25-4054fb9ebed4" containerID="2100fb72639d40357a782755084a5c740c6dfd377769b78e51c6664be6fb5423" exitCode=0 Dec 02 20:11:53 crc kubenswrapper[4807]: I1202 20:11:53.436815 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" event={"ID":"25348ba1-760f-46a3-9f25-4054fb9ebed4","Type":"ContainerDied","Data":"2100fb72639d40357a782755084a5c740c6dfd377769b78e51c6664be6fb5423"} Dec 02 20:11:53 crc kubenswrapper[4807]: I1202 20:11:53.439977 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbl6" 
event={"ID":"fa686bc1-b30b-4041-952e-cc0280094d85","Type":"ContainerStarted","Data":"4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b"} Dec 02 20:11:54 crc kubenswrapper[4807]: I1202 20:11:54.448687 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" event={"ID":"25348ba1-760f-46a3-9f25-4054fb9ebed4","Type":"ContainerStarted","Data":"f366891748447fe798d6fd32712489d60a5be89181d583f05a78dae189e908db"} Dec 02 20:11:54 crc kubenswrapper[4807]: I1202 20:11:54.475604 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" podStartSLOduration=5.070510587 podStartE2EDuration="11.475581014s" podCreationTimestamp="2025-12-02 20:11:43 +0000 UTC" firstStartedPulling="2025-12-02 20:11:45.384664671 +0000 UTC m=+840.685572186" lastFinishedPulling="2025-12-02 20:11:51.789735098 +0000 UTC m=+847.090642613" observedRunningTime="2025-12-02 20:11:54.471317881 +0000 UTC m=+849.772225386" watchObservedRunningTime="2025-12-02 20:11:54.475581014 +0000 UTC m=+849.776488509" Dec 02 20:11:55 crc kubenswrapper[4807]: I1202 20:11:55.459468 4807 generic.go:334] "Generic (PLEG): container finished" podID="25348ba1-760f-46a3-9f25-4054fb9ebed4" containerID="f366891748447fe798d6fd32712489d60a5be89181d583f05a78dae189e908db" exitCode=0 Dec 02 20:11:55 crc kubenswrapper[4807]: I1202 20:11:55.459604 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" event={"ID":"25348ba1-760f-46a3-9f25-4054fb9ebed4","Type":"ContainerDied","Data":"f366891748447fe798d6fd32712489d60a5be89181d583f05a78dae189e908db"} Dec 02 20:11:55 crc kubenswrapper[4807]: I1202 20:11:55.463663 4807 generic.go:334] "Generic (PLEG): container finished" podID="fa686bc1-b30b-4041-952e-cc0280094d85" 
containerID="4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b" exitCode=0 Dec 02 20:11:55 crc kubenswrapper[4807]: I1202 20:11:55.463979 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbl6" event={"ID":"fa686bc1-b30b-4041-952e-cc0280094d85","Type":"ContainerDied","Data":"4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b"} Dec 02 20:11:55 crc kubenswrapper[4807]: E1202 20:11:55.890084 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-conmon-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:11:56 crc kubenswrapper[4807]: I1202 20:11:56.482350 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbl6" event={"ID":"fa686bc1-b30b-4041-952e-cc0280094d85","Type":"ContainerStarted","Data":"1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340"} Dec 02 20:11:56 crc kubenswrapper[4807]: I1202 20:11:56.510634 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7fbl6" podStartSLOduration=2.984296208 podStartE2EDuration="11.510613162s" podCreationTimestamp="2025-12-02 20:11:45 +0000 UTC" firstStartedPulling="2025-12-02 20:11:47.403235844 +0000 UTC m=+842.704143339" lastFinishedPulling="2025-12-02 20:11:55.929552788 +0000 UTC m=+851.230460293" observedRunningTime="2025-12-02 20:11:56.506111783 +0000 UTC m=+851.807019298" watchObservedRunningTime="2025-12-02 20:11:56.510613162 +0000 UTC 
m=+851.811520667" Dec 02 20:11:56 crc kubenswrapper[4807]: I1202 20:11:56.875811 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.021855 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-bundle\") pod \"25348ba1-760f-46a3-9f25-4054fb9ebed4\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.022000 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-util\") pod \"25348ba1-760f-46a3-9f25-4054fb9ebed4\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.022073 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmk56\" (UniqueName: \"kubernetes.io/projected/25348ba1-760f-46a3-9f25-4054fb9ebed4-kube-api-access-nmk56\") pod \"25348ba1-760f-46a3-9f25-4054fb9ebed4\" (UID: \"25348ba1-760f-46a3-9f25-4054fb9ebed4\") " Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.024033 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-bundle" (OuterVolumeSpecName: "bundle") pod "25348ba1-760f-46a3-9f25-4054fb9ebed4" (UID: "25348ba1-760f-46a3-9f25-4054fb9ebed4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.028085 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25348ba1-760f-46a3-9f25-4054fb9ebed4-kube-api-access-nmk56" (OuterVolumeSpecName: "kube-api-access-nmk56") pod "25348ba1-760f-46a3-9f25-4054fb9ebed4" (UID: "25348ba1-760f-46a3-9f25-4054fb9ebed4"). InnerVolumeSpecName "kube-api-access-nmk56". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.032217 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-util" (OuterVolumeSpecName: "util") pod "25348ba1-760f-46a3-9f25-4054fb9ebed4" (UID: "25348ba1-760f-46a3-9f25-4054fb9ebed4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.124592 4807 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.124635 4807 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/25348ba1-760f-46a3-9f25-4054fb9ebed4-util\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.124645 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmk56\" (UniqueName: \"kubernetes.io/projected/25348ba1-760f-46a3-9f25-4054fb9ebed4-kube-api-access-nmk56\") on node \"crc\" DevicePath \"\"" Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.492884 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" 
event={"ID":"25348ba1-760f-46a3-9f25-4054fb9ebed4","Type":"ContainerDied","Data":"c5a162fb2d477bd804227acd05a8db772c79dc271f17259d0a419726ecb66b9c"} Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.492967 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5a162fb2d477bd804227acd05a8db772c79dc271f17259d0a419726ecb66b9c" Dec 02 20:11:57 crc kubenswrapper[4807]: I1202 20:11:57.493951 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc" Dec 02 20:12:06 crc kubenswrapper[4807]: E1202 20:12:06.030857 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-conmon-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:12:06 crc kubenswrapper[4807]: I1202 20:12:06.278336 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:12:06 crc kubenswrapper[4807]: I1202 20:12:06.278397 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:12:06 crc kubenswrapper[4807]: I1202 20:12:06.495179 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:12:06 crc kubenswrapper[4807]: I1202 20:12:06.752169 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:12:06 crc kubenswrapper[4807]: I1202 20:12:06.880673 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7fbl6"] Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.638974 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7fbl6" podUID="fa686bc1-b30b-4041-952e-cc0280094d85" containerName="registry-server" containerID="cri-o://1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340" gracePeriod=2 Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.861287 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9"] Dec 02 20:12:08 crc kubenswrapper[4807]: E1202 20:12:08.861524 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25348ba1-760f-46a3-9f25-4054fb9ebed4" containerName="extract" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.861537 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="25348ba1-760f-46a3-9f25-4054fb9ebed4" containerName="extract" Dec 02 20:12:08 crc kubenswrapper[4807]: E1202 20:12:08.861549 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25348ba1-760f-46a3-9f25-4054fb9ebed4" containerName="util" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.861555 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="25348ba1-760f-46a3-9f25-4054fb9ebed4" containerName="util" Dec 02 20:12:08 crc kubenswrapper[4807]: E1202 20:12:08.861567 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25348ba1-760f-46a3-9f25-4054fb9ebed4" containerName="pull" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.861573 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="25348ba1-760f-46a3-9f25-4054fb9ebed4" containerName="pull" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.861667 4807 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="25348ba1-760f-46a3-9f25-4054fb9ebed4" containerName="extract" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.862091 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.866500 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.870619 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-xphht" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.878861 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9"] Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.884541 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.899423 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-267k4\" (UniqueName: \"kubernetes.io/projected/11697d44-d0d8-49be-ada1-7de7ab69950b-kube-api-access-267k4\") pod \"obo-prometheus-operator-668cf9dfbb-wv8s9\" (UID: \"11697d44-d0d8-49be-ada1-7de7ab69950b\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.949996 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw"] Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.950864 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.964307 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-bnqg5" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.964611 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 02 20:12:08 crc kubenswrapper[4807]: I1202 20:12:08.992519 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw"] Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.001017 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d45946cf-5cdf-4461-a0c0-c90b1367e919-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw\" (UID: \"d45946cf-5cdf-4461-a0c0-c90b1367e919\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.001100 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-267k4\" (UniqueName: \"kubernetes.io/projected/11697d44-d0d8-49be-ada1-7de7ab69950b-kube-api-access-267k4\") pod \"obo-prometheus-operator-668cf9dfbb-wv8s9\" (UID: \"11697d44-d0d8-49be-ada1-7de7ab69950b\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.001170 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d45946cf-5cdf-4461-a0c0-c90b1367e919-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw\" (UID: 
\"d45946cf-5cdf-4461-a0c0-c90b1367e919\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.005684 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr"] Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.013250 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.046647 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-267k4\" (UniqueName: \"kubernetes.io/projected/11697d44-d0d8-49be-ada1-7de7ab69950b-kube-api-access-267k4\") pod \"obo-prometheus-operator-668cf9dfbb-wv8s9\" (UID: \"11697d44-d0d8-49be-ada1-7de7ab69950b\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.109419 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a50d34d7-348c-4533-8b5b-8c5f3ee88af3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr\" (UID: \"a50d34d7-348c-4533-8b5b-8c5f3ee88af3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.109525 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d45946cf-5cdf-4461-a0c0-c90b1367e919-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw\" (UID: \"d45946cf-5cdf-4461-a0c0-c90b1367e919\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.109581 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a50d34d7-348c-4533-8b5b-8c5f3ee88af3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr\" (UID: \"a50d34d7-348c-4533-8b5b-8c5f3ee88af3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.109627 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d45946cf-5cdf-4461-a0c0-c90b1367e919-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw\" (UID: \"d45946cf-5cdf-4461-a0c0-c90b1367e919\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.127293 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d45946cf-5cdf-4461-a0c0-c90b1367e919-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw\" (UID: \"d45946cf-5cdf-4461-a0c0-c90b1367e919\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.141761 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d45946cf-5cdf-4461-a0c0-c90b1367e919-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw\" (UID: \"d45946cf-5cdf-4461-a0c0-c90b1367e919\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.160657 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr"] Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.179000 4807 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.204828 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-m69t7"] Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.205637 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.211995 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a50d34d7-348c-4533-8b5b-8c5f3ee88af3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr\" (UID: \"a50d34d7-348c-4533-8b5b-8c5f3ee88af3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.212098 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a50d34d7-348c-4533-8b5b-8c5f3ee88af3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr\" (UID: \"a50d34d7-348c-4533-8b5b-8c5f3ee88af3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.217521 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a50d34d7-348c-4533-8b5b-8c5f3ee88af3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr\" (UID: \"a50d34d7-348c-4533-8b5b-8c5f3ee88af3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.226853 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operators"/"observability-operator-sa-dockercfg-k8zdx" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.227052 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.228070 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a50d34d7-348c-4533-8b5b-8c5f3ee88af3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr\" (UID: \"a50d34d7-348c-4533-8b5b-8c5f3ee88af3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.236170 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-m69t7"] Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.308023 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.318392 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/818c8714-3224-4307-92c3-efc98ece9f1d-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-m69t7\" (UID: \"818c8714-3224-4307-92c3-efc98ece9f1d\") " pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.318600 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lpck\" (UniqueName: \"kubernetes.io/projected/818c8714-3224-4307-92c3-efc98ece9f1d-kube-api-access-7lpck\") pod \"observability-operator-d8bb48f5d-m69t7\" (UID: \"818c8714-3224-4307-92c3-efc98ece9f1d\") " 
pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.359966 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-glnxl"] Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.364890 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-glnxl" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.368173 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-xthch" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.373231 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-glnxl"] Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.373592 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.421823 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/818c8714-3224-4307-92c3-efc98ece9f1d-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-m69t7\" (UID: \"818c8714-3224-4307-92c3-efc98ece9f1d\") " pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.423123 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lpck\" (UniqueName: \"kubernetes.io/projected/818c8714-3224-4307-92c3-efc98ece9f1d-kube-api-access-7lpck\") pod \"observability-operator-d8bb48f5d-m69t7\" (UID: \"818c8714-3224-4307-92c3-efc98ece9f1d\") " pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.423169 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q95h9\" (UniqueName: \"kubernetes.io/projected/6d0398c9-073c-437e-a5cf-e8abec984ebe-kube-api-access-q95h9\") pod \"perses-operator-5446b9c989-glnxl\" (UID: \"6d0398c9-073c-437e-a5cf-e8abec984ebe\") " pod="openshift-operators/perses-operator-5446b9c989-glnxl" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.423275 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/6d0398c9-073c-437e-a5cf-e8abec984ebe-openshift-service-ca\") pod \"perses-operator-5446b9c989-glnxl\" (UID: \"6d0398c9-073c-437e-a5cf-e8abec984ebe\") " pod="openshift-operators/perses-operator-5446b9c989-glnxl" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.429609 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/818c8714-3224-4307-92c3-efc98ece9f1d-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-m69t7\" (UID: \"818c8714-3224-4307-92c3-efc98ece9f1d\") " pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.434600 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.444432 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lpck\" (UniqueName: \"kubernetes.io/projected/818c8714-3224-4307-92c3-efc98ece9f1d-kube-api-access-7lpck\") pod \"observability-operator-d8bb48f5d-m69t7\" (UID: \"818c8714-3224-4307-92c3-efc98ece9f1d\") " pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.526265 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-catalog-content\") pod \"fa686bc1-b30b-4041-952e-cc0280094d85\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.526846 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s4rt\" (UniqueName: \"kubernetes.io/projected/fa686bc1-b30b-4041-952e-cc0280094d85-kube-api-access-5s4rt\") pod \"fa686bc1-b30b-4041-952e-cc0280094d85\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.526920 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-utilities\") pod \"fa686bc1-b30b-4041-952e-cc0280094d85\" (UID: \"fa686bc1-b30b-4041-952e-cc0280094d85\") " Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.527184 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q95h9\" (UniqueName: \"kubernetes.io/projected/6d0398c9-073c-437e-a5cf-e8abec984ebe-kube-api-access-q95h9\") pod \"perses-operator-5446b9c989-glnxl\" (UID: \"6d0398c9-073c-437e-a5cf-e8abec984ebe\") " pod="openshift-operators/perses-operator-5446b9c989-glnxl" Dec 02 
20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.527278 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/6d0398c9-073c-437e-a5cf-e8abec984ebe-openshift-service-ca\") pod \"perses-operator-5446b9c989-glnxl\" (UID: \"6d0398c9-073c-437e-a5cf-e8abec984ebe\") " pod="openshift-operators/perses-operator-5446b9c989-glnxl" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.528467 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/6d0398c9-073c-437e-a5cf-e8abec984ebe-openshift-service-ca\") pod \"perses-operator-5446b9c989-glnxl\" (UID: \"6d0398c9-073c-437e-a5cf-e8abec984ebe\") " pod="openshift-operators/perses-operator-5446b9c989-glnxl" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.530567 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-utilities" (OuterVolumeSpecName: "utilities") pod "fa686bc1-b30b-4041-952e-cc0280094d85" (UID: "fa686bc1-b30b-4041-952e-cc0280094d85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.540057 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa686bc1-b30b-4041-952e-cc0280094d85-kube-api-access-5s4rt" (OuterVolumeSpecName: "kube-api-access-5s4rt") pod "fa686bc1-b30b-4041-952e-cc0280094d85" (UID: "fa686bc1-b30b-4041-952e-cc0280094d85"). InnerVolumeSpecName "kube-api-access-5s4rt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.564613 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q95h9\" (UniqueName: \"kubernetes.io/projected/6d0398c9-073c-437e-a5cf-e8abec984ebe-kube-api-access-q95h9\") pod \"perses-operator-5446b9c989-glnxl\" (UID: \"6d0398c9-073c-437e-a5cf-e8abec984ebe\") " pod="openshift-operators/perses-operator-5446b9c989-glnxl" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.619081 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.630778 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s4rt\" (UniqueName: \"kubernetes.io/projected/fa686bc1-b30b-4041-952e-cc0280094d85-kube-api-access-5s4rt\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.630810 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.705781 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa686bc1-b30b-4041-952e-cc0280094d85" (UID: "fa686bc1-b30b-4041-952e-cc0280094d85"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.711009 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9"] Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.719535 4807 generic.go:334] "Generic (PLEG): container finished" podID="fa686bc1-b30b-4041-952e-cc0280094d85" containerID="1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340" exitCode=0 Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.719613 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbl6" event={"ID":"fa686bc1-b30b-4041-952e-cc0280094d85","Type":"ContainerDied","Data":"1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340"} Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.719752 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7fbl6" event={"ID":"fa686bc1-b30b-4041-952e-cc0280094d85","Type":"ContainerDied","Data":"1db0f4b90d5f651da6b5a73a49fc289bc1e25ca9a6cb6a86c4ab1e2348a60f78"} Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.719782 4807 scope.go:117] "RemoveContainer" containerID="1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.719783 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7fbl6" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.731801 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa686bc1-b30b-4041-952e-cc0280094d85-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.759197 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-glnxl" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.770247 4807 scope.go:117] "RemoveContainer" containerID="4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.876931 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7fbl6"] Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.891100 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7fbl6"] Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.926450 4807 scope.go:117] "RemoveContainer" containerID="6942dd8686b085a959ff0d6732b430281a2998c2e64c27a823a07a09aa9bc167" Dec 02 20:12:09 crc kubenswrapper[4807]: I1202 20:12:09.996698 4807 scope.go:117] "RemoveContainer" containerID="1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340" Dec 02 20:12:10 crc kubenswrapper[4807]: E1202 20:12:10.008886 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340\": container with ID starting with 1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340 not found: ID does not exist" containerID="1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340" Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.008954 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340"} err="failed to get container status \"1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340\": rpc error: code = NotFound desc = could not find container \"1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340\": container with ID starting with 1e25b5b635f539a9cf96e1d2d002a08f3e020e4e801783d62354c3fe91a18340 not found: 
ID does not exist" Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.008984 4807 scope.go:117] "RemoveContainer" containerID="4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b" Dec 02 20:12:10 crc kubenswrapper[4807]: E1202 20:12:10.009973 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b\": container with ID starting with 4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b not found: ID does not exist" containerID="4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b" Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.009991 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b"} err="failed to get container status \"4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b\": rpc error: code = NotFound desc = could not find container \"4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b\": container with ID starting with 4b481fdf211b00336c494d2ff2e18c0544e37cedc69f28e0baa02271ddd0e80b not found: ID does not exist" Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.010004 4807 scope.go:117] "RemoveContainer" containerID="6942dd8686b085a959ff0d6732b430281a2998c2e64c27a823a07a09aa9bc167" Dec 02 20:12:10 crc kubenswrapper[4807]: E1202 20:12:10.015974 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6942dd8686b085a959ff0d6732b430281a2998c2e64c27a823a07a09aa9bc167\": container with ID starting with 6942dd8686b085a959ff0d6732b430281a2998c2e64c27a823a07a09aa9bc167 not found: ID does not exist" containerID="6942dd8686b085a959ff0d6732b430281a2998c2e64c27a823a07a09aa9bc167" Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.016023 4807 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6942dd8686b085a959ff0d6732b430281a2998c2e64c27a823a07a09aa9bc167"} err="failed to get container status \"6942dd8686b085a959ff0d6732b430281a2998c2e64c27a823a07a09aa9bc167\": rpc error: code = NotFound desc = could not find container \"6942dd8686b085a959ff0d6732b430281a2998c2e64c27a823a07a09aa9bc167\": container with ID starting with 6942dd8686b085a959ff0d6732b430281a2998c2e64c27a823a07a09aa9bc167 not found: ID does not exist" Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.058266 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw"] Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.123554 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr"] Dec 02 20:12:10 crc kubenswrapper[4807]: W1202 20:12:10.129660 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda50d34d7_348c_4533_8b5b_8c5f3ee88af3.slice/crio-8bfc0a558b874b629dc44ee0232d1243fb01f3551ceb2a1c3015bd1cc55ed521 WatchSource:0}: Error finding container 8bfc0a558b874b629dc44ee0232d1243fb01f3551ceb2a1c3015bd1cc55ed521: Status 404 returned error can't find the container with id 8bfc0a558b874b629dc44ee0232d1243fb01f3551ceb2a1c3015bd1cc55ed521 Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.169909 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-m69t7"] Dec 02 20:12:10 crc kubenswrapper[4807]: W1202 20:12:10.179255 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod818c8714_3224_4307_92c3_efc98ece9f1d.slice/crio-7b110077f170524641ec2567b22883b4894da51f8d98004b30a92c5c44aa7448 WatchSource:0}: Error finding container 
7b110077f170524641ec2567b22883b4894da51f8d98004b30a92c5c44aa7448: Status 404 returned error can't find the container with id 7b110077f170524641ec2567b22883b4894da51f8d98004b30a92c5c44aa7448 Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.260180 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-glnxl"] Dec 02 20:12:10 crc kubenswrapper[4807]: W1202 20:12:10.268639 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d0398c9_073c_437e_a5cf_e8abec984ebe.slice/crio-8fa024bdafde3631884c83f6879a7d1dc6c6e56d21807b3c6e3bbca8857be858 WatchSource:0}: Error finding container 8fa024bdafde3631884c83f6879a7d1dc6c6e56d21807b3c6e3bbca8857be858: Status 404 returned error can't find the container with id 8fa024bdafde3631884c83f6879a7d1dc6c6e56d21807b3c6e3bbca8857be858 Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.729833 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" event={"ID":"818c8714-3224-4307-92c3-efc98ece9f1d","Type":"ContainerStarted","Data":"7b110077f170524641ec2567b22883b4894da51f8d98004b30a92c5c44aa7448"} Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.731256 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9" event={"ID":"11697d44-d0d8-49be-ada1-7de7ab69950b","Type":"ContainerStarted","Data":"065d19c8d9899e0ef86a0e86366c64638dda441ad8e1e7489033955b3943ef79"} Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.733854 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" event={"ID":"a50d34d7-348c-4533-8b5b-8c5f3ee88af3","Type":"ContainerStarted","Data":"8bfc0a558b874b629dc44ee0232d1243fb01f3551ceb2a1c3015bd1cc55ed521"} Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.734970 
4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" event={"ID":"d45946cf-5cdf-4461-a0c0-c90b1367e919","Type":"ContainerStarted","Data":"164fb17ced3b84d6ab948c7b67f2c08b4ac9e48c48b4a1e3eaf0cd886729f9f2"} Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.736171 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-glnxl" event={"ID":"6d0398c9-073c-437e-a5cf-e8abec984ebe","Type":"ContainerStarted","Data":"8fa024bdafde3631884c83f6879a7d1dc6c6e56d21807b3c6e3bbca8857be858"} Dec 02 20:12:10 crc kubenswrapper[4807]: I1202 20:12:10.985001 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa686bc1-b30b-4041-952e-cc0280094d85" path="/var/lib/kubelet/pods/fa686bc1-b30b-4041-952e-cc0280094d85/volumes" Dec 02 20:12:16 crc kubenswrapper[4807]: E1202 20:12:16.376897 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-conmon-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:12:26 crc kubenswrapper[4807]: E1202 20:12:26.559498 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-conmon-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:12:27 crc kubenswrapper[4807]: E1202 20:12:27.100305 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3" Dec 02 20:12:27 crc kubenswrapper[4807]: E1202 20:12:27.100983 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-267k4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-668cf9dfbb-wv8s9_openshift-operators(11697d44-d0d8-49be-ada1-7de7ab69950b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:12:27 crc kubenswrapper[4807]: E1202 20:12:27.102186 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9" podUID="11697d44-d0d8-49be-ada1-7de7ab69950b" Dec 02 20:12:27 crc kubenswrapper[4807]: E1202 20:12:27.784592 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec" Dec 02 20:12:27 crc kubenswrapper[4807]: E1202 20:12:27.784792 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw_openshift-operators(d45946cf-5cdf-4461-a0c0-c90b1367e919): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:12:27 crc kubenswrapper[4807]: E1202 20:12:27.785939 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" podUID="d45946cf-5cdf-4461-a0c0-c90b1367e919" Dec 02 20:12:27 crc kubenswrapper[4807]: E1202 20:12:27.934169 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" podUID="d45946cf-5cdf-4461-a0c0-c90b1367e919" Dec 02 20:12:27 crc kubenswrapper[4807]: E1202 20:12:27.934430 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3\\\"\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9" podUID="11697d44-d0d8-49be-ada1-7de7ab69950b" Dec 02 20:12:30 crc kubenswrapper[4807]: E1202 20:12:30.832096 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec" Dec 02 20:12:30 crc kubenswrapper[4807]: E1202 20:12:30.832644 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt 
--web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr_openshift-operators(a50d34d7-348c-4533-8b5b-8c5f3ee88af3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:12:30 crc kubenswrapper[4807]: E1202 20:12:30.834196 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" podUID="a50d34d7-348c-4533-8b5b-8c5f3ee88af3" Dec 02 20:12:30 crc kubenswrapper[4807]: E1202 20:12:30.842749 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb" Dec 02 20:12:30 crc kubenswrapper[4807]: E1202 20:12:30.843067 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) 
--images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) --openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:e718854a7d6ca8accf0fa72db0eb902e46c44d747ad51dc3f06bba0cefaa3c01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:d972f4faa5e9c121402d23ed85002f26af48ec36b1b71a7489d677b3913d08b4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:91531137fc1dcd740e277e0f65e120a0176a16f788c14c27925b61aa0b792ade,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:897e1bfad1187062725b54d87107bd0155972257a50d8335dd29e1999b828a4f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:95fe5b5746ca8c07ac9217ce2d8ac8e6afad17af210f9d8e0074df1310b209a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-r
hel9@sha256:e9d9a89e4d8126a62b1852055482258ee528cac6398dd5d43ebad75ace0f33c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:ec684a0645ceb917b019af7ddba68c3533416e356ab0d0320a30e75ca7ebb31b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:3b9693fcde9b3a9494fb04735b1f7cfd0426f10be820fdc3f024175c0d3df1c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:580606f194180accc8abba099e17a26dca7522ec6d233fa2fdd40312771703e3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:e03777be39e71701935059cd877603874a13ac94daa73219d4e5e545599d78a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:aa47256193cfd2877853878e1ae97d2ab8b8e5deae62b387cbfad02b284d379c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:c595ff56b2cb85514bf4784db6ddb82e4e657e3e708a7fb695fc4997379a94d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:45a4ec2a519bcec99e886aa91596d5356a2414a2bd103baaef9fa7838c672eb2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7lpck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-d8bb48f5d-m69t7_openshift-operators(818c8714-3224-4307-92c3-efc98ece9f1d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:12:30 crc kubenswrapper[4807]: E1202 20:12:30.844242 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" podUID="818c8714-3224-4307-92c3-efc98ece9f1d" Dec 02 20:12:30 crc kubenswrapper[4807]: E1202 20:12:30.961906 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" podUID="a50d34d7-348c-4533-8b5b-8c5f3ee88af3" Dec 02 20:12:30 crc kubenswrapper[4807]: E1202 20:12:30.961949 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb\\\"\"" pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" podUID="818c8714-3224-4307-92c3-efc98ece9f1d" Dec 02 20:12:31 crc kubenswrapper[4807]: E1202 20:12:31.425036 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385" Dec 02 20:12:31 crc kubenswrapper[4807]: E1202 20:12:31.425338 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q95h9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5446b9c989-glnxl_openshift-operators(6d0398c9-073c-437e-a5cf-e8abec984ebe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 20:12:31 crc kubenswrapper[4807]: E1202 20:12:31.427015 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5446b9c989-glnxl" podUID="6d0398c9-073c-437e-a5cf-e8abec984ebe" Dec 02 20:12:31 crc kubenswrapper[4807]: E1202 20:12:31.966655 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385\\\"\"" pod="openshift-operators/perses-operator-5446b9c989-glnxl" podUID="6d0398c9-073c-437e-a5cf-e8abec984ebe" Dec 02 20:12:36 crc kubenswrapper[4807]: E1202 
20:12:36.688932 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-conmon-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25348ba1_760f_46a3_9f25_4054fb9ebed4.slice/crio-623c7e0f64720a1d4899ad8c28540e1817ac06ba52a6eb4efb97f46765f00fc2.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.667690 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dtq86"] Dec 02 20:12:39 crc kubenswrapper[4807]: E1202 20:12:39.668247 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa686bc1-b30b-4041-952e-cc0280094d85" containerName="extract-content" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.668260 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa686bc1-b30b-4041-952e-cc0280094d85" containerName="extract-content" Dec 02 20:12:39 crc kubenswrapper[4807]: E1202 20:12:39.668278 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa686bc1-b30b-4041-952e-cc0280094d85" containerName="extract-utilities" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.668284 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa686bc1-b30b-4041-952e-cc0280094d85" containerName="extract-utilities" Dec 02 20:12:39 crc kubenswrapper[4807]: E1202 20:12:39.668295 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa686bc1-b30b-4041-952e-cc0280094d85" containerName="registry-server" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.668302 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa686bc1-b30b-4041-952e-cc0280094d85" containerName="registry-server" Dec 02 20:12:39 crc 
kubenswrapper[4807]: I1202 20:12:39.668394 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa686bc1-b30b-4041-952e-cc0280094d85" containerName="registry-server" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.669152 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.685004 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtq86"] Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.811228 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-utilities\") pod \"redhat-marketplace-dtq86\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.811283 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d46k\" (UniqueName: \"kubernetes.io/projected/d40a0534-b58e-4a62-b3a1-5415909d89ca-kube-api-access-4d46k\") pod \"redhat-marketplace-dtq86\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.811541 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-catalog-content\") pod \"redhat-marketplace-dtq86\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.913257 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-catalog-content\") pod \"redhat-marketplace-dtq86\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.913697 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-utilities\") pod \"redhat-marketplace-dtq86\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.913848 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d46k\" (UniqueName: \"kubernetes.io/projected/d40a0534-b58e-4a62-b3a1-5415909d89ca-kube-api-access-4d46k\") pod \"redhat-marketplace-dtq86\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.913793 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-catalog-content\") pod \"redhat-marketplace-dtq86\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.914228 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-utilities\") pod \"redhat-marketplace-dtq86\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.938037 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d46k\" (UniqueName: 
\"kubernetes.io/projected/d40a0534-b58e-4a62-b3a1-5415909d89ca-kube-api-access-4d46k\") pod \"redhat-marketplace-dtq86\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:39 crc kubenswrapper[4807]: I1202 20:12:39.988231 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:40 crc kubenswrapper[4807]: I1202 20:12:40.254272 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtq86"] Dec 02 20:12:40 crc kubenswrapper[4807]: W1202 20:12:40.268067 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd40a0534_b58e_4a62_b3a1_5415909d89ca.slice/crio-9e722263c628d39fed6d5767a6f3d9c0aad3de754e0cc4df40bb3c4bf52e983c WatchSource:0}: Error finding container 9e722263c628d39fed6d5767a6f3d9c0aad3de754e0cc4df40bb3c4bf52e983c: Status 404 returned error can't find the container with id 9e722263c628d39fed6d5767a6f3d9c0aad3de754e0cc4df40bb3c4bf52e983c Dec 02 20:12:41 crc kubenswrapper[4807]: I1202 20:12:41.019268 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" event={"ID":"d45946cf-5cdf-4461-a0c0-c90b1367e919","Type":"ContainerStarted","Data":"2a497b32b7fdb9136c7e0a938aba3e52308e188479721b6d52955c2942f7957a"} Dec 02 20:12:41 crc kubenswrapper[4807]: I1202 20:12:41.021354 4807 generic.go:334] "Generic (PLEG): container finished" podID="d40a0534-b58e-4a62-b3a1-5415909d89ca" containerID="974fb0c3f5278a94df53a3be912219508b64e45c539d6ce04026e5d8bdbf2c2f" exitCode=0 Dec 02 20:12:41 crc kubenswrapper[4807]: I1202 20:12:41.021390 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtq86" 
event={"ID":"d40a0534-b58e-4a62-b3a1-5415909d89ca","Type":"ContainerDied","Data":"974fb0c3f5278a94df53a3be912219508b64e45c539d6ce04026e5d8bdbf2c2f"} Dec 02 20:12:41 crc kubenswrapper[4807]: I1202 20:12:41.021412 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtq86" event={"ID":"d40a0534-b58e-4a62-b3a1-5415909d89ca","Type":"ContainerStarted","Data":"9e722263c628d39fed6d5767a6f3d9c0aad3de754e0cc4df40bb3c4bf52e983c"} Dec 02 20:12:41 crc kubenswrapper[4807]: I1202 20:12:41.044246 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw" podStartSLOduration=2.516993867 podStartE2EDuration="33.04422342s" podCreationTimestamp="2025-12-02 20:12:08 +0000 UTC" firstStartedPulling="2025-12-02 20:12:10.097161747 +0000 UTC m=+865.398069242" lastFinishedPulling="2025-12-02 20:12:40.6243913 +0000 UTC m=+895.925298795" observedRunningTime="2025-12-02 20:12:41.041632116 +0000 UTC m=+896.342539621" watchObservedRunningTime="2025-12-02 20:12:41.04422342 +0000 UTC m=+896.345130915" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.141158 4807 generic.go:334] "Generic (PLEG): container finished" podID="d40a0534-b58e-4a62-b3a1-5415909d89ca" containerID="40e3c42c30a2521ed0b01de341dc35b22a97eda4a71dfd0682f99d7164e2fda8" exitCode=0 Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.141318 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtq86" event={"ID":"d40a0534-b58e-4a62-b3a1-5415909d89ca","Type":"ContainerDied","Data":"40e3c42c30a2521ed0b01de341dc35b22a97eda4a71dfd0682f99d7164e2fda8"} Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.144852 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" 
event={"ID":"818c8714-3224-4307-92c3-efc98ece9f1d","Type":"ContainerStarted","Data":"580c1ed6ec09ba09af970077a797874971503d8e090c6c9e920ba09a419b855c"} Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.145677 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.147829 4807 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-m69t7 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.43:8081/healthz\": dial tcp 10.217.0.43:8081: connect: connection refused" start-of-body= Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.147889 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" podUID="818c8714-3224-4307-92c3-efc98ece9f1d" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.43:8081/healthz\": dial tcp 10.217.0.43:8081: connect: connection refused" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.200914 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" podStartSLOduration=1.741754393 podStartE2EDuration="34.200890899s" podCreationTimestamp="2025-12-02 20:12:09 +0000 UTC" firstStartedPulling="2025-12-02 20:12:10.183661398 +0000 UTC m=+865.484568893" lastFinishedPulling="2025-12-02 20:12:42.642797904 +0000 UTC m=+897.943705399" observedRunningTime="2025-12-02 20:12:43.198252403 +0000 UTC m=+898.499159898" watchObservedRunningTime="2025-12-02 20:12:43.200890899 +0000 UTC m=+898.501798394" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.463280 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rhm5n"] Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.464918 4807 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.480957 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rhm5n"] Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.572090 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-catalog-content\") pod \"certified-operators-rhm5n\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.572684 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-utilities\") pod \"certified-operators-rhm5n\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.572764 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr4p8\" (UniqueName: \"kubernetes.io/projected/2d05ff98-e881-4a16-a902-09c3afd1bf61-kube-api-access-vr4p8\") pod \"certified-operators-rhm5n\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.675406 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-utilities\") pod \"certified-operators-rhm5n\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.675512 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vr4p8\" (UniqueName: \"kubernetes.io/projected/2d05ff98-e881-4a16-a902-09c3afd1bf61-kube-api-access-vr4p8\") pod \"certified-operators-rhm5n\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.675587 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-catalog-content\") pod \"certified-operators-rhm5n\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.676468 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-catalog-content\") pod \"certified-operators-rhm5n\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.676543 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-utilities\") pod \"certified-operators-rhm5n\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.705952 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr4p8\" (UniqueName: \"kubernetes.io/projected/2d05ff98-e881-4a16-a902-09c3afd1bf61-kube-api-access-vr4p8\") pod \"certified-operators-rhm5n\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:43 crc kubenswrapper[4807]: I1202 20:12:43.782582 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:44 crc kubenswrapper[4807]: I1202 20:12:44.100276 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rhm5n"] Dec 02 20:12:44 crc kubenswrapper[4807]: I1202 20:12:44.181900 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-m69t7" Dec 02 20:12:45 crc kubenswrapper[4807]: I1202 20:12:45.159288 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtq86" event={"ID":"d40a0534-b58e-4a62-b3a1-5415909d89ca","Type":"ContainerStarted","Data":"636a7cec89badf87775131bad6dfea21c3f447c41529779862c3c0eb8d804be4"} Dec 02 20:12:45 crc kubenswrapper[4807]: I1202 20:12:45.161497 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" event={"ID":"a50d34d7-348c-4533-8b5b-8c5f3ee88af3","Type":"ContainerStarted","Data":"d6ee4688597f7df6e409ba3e1e761e5e7ce2b931095eee57895e6ed930614c95"} Dec 02 20:12:45 crc kubenswrapper[4807]: I1202 20:12:45.163137 4807 generic.go:334] "Generic (PLEG): container finished" podID="2d05ff98-e881-4a16-a902-09c3afd1bf61" containerID="11ea49b7210901458171b75eb85c5ab74f5d79228ab2c27953088350fd9649cf" exitCode=0 Dec 02 20:12:45 crc kubenswrapper[4807]: I1202 20:12:45.163523 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhm5n" event={"ID":"2d05ff98-e881-4a16-a902-09c3afd1bf61","Type":"ContainerDied","Data":"11ea49b7210901458171b75eb85c5ab74f5d79228ab2c27953088350fd9649cf"} Dec 02 20:12:45 crc kubenswrapper[4807]: I1202 20:12:45.163549 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhm5n" 
event={"ID":"2d05ff98-e881-4a16-a902-09c3afd1bf61","Type":"ContainerStarted","Data":"a38ffa953c3b3054a5230188dd583fd694ea18b33321257c3dfb8feb67510cd1"} Dec 02 20:12:45 crc kubenswrapper[4807]: I1202 20:12:45.191677 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dtq86" podStartSLOduration=3.011692526 podStartE2EDuration="6.191656991s" podCreationTimestamp="2025-12-02 20:12:39 +0000 UTC" firstStartedPulling="2025-12-02 20:12:41.024126334 +0000 UTC m=+896.325033829" lastFinishedPulling="2025-12-02 20:12:44.204090799 +0000 UTC m=+899.504998294" observedRunningTime="2025-12-02 20:12:45.186203205 +0000 UTC m=+900.487110730" watchObservedRunningTime="2025-12-02 20:12:45.191656991 +0000 UTC m=+900.492564486" Dec 02 20:12:45 crc kubenswrapper[4807]: I1202 20:12:45.241369 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr" podStartSLOduration=-9223371999.613441 podStartE2EDuration="37.241333426s" podCreationTimestamp="2025-12-02 20:12:08 +0000 UTC" firstStartedPulling="2025-12-02 20:12:10.135887528 +0000 UTC m=+865.436795023" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:12:45.236792285 +0000 UTC m=+900.537699800" watchObservedRunningTime="2025-12-02 20:12:45.241333426 +0000 UTC m=+900.542240931" Dec 02 20:12:46 crc kubenswrapper[4807]: I1202 20:12:46.174479 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9" event={"ID":"11697d44-d0d8-49be-ada1-7de7ab69950b","Type":"ContainerStarted","Data":"685a20f51a3605117d57a80a7519ec4b399207e5c15ec2b39fb0d7f67881281d"} Dec 02 20:12:47 crc kubenswrapper[4807]: I1202 20:12:47.183237 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-glnxl" 
event={"ID":"6d0398c9-073c-437e-a5cf-e8abec984ebe","Type":"ContainerStarted","Data":"0f180be3e429347eefca917465ff02e738beb8d052be5ad4b0eb6f11c85eda0e"} Dec 02 20:12:47 crc kubenswrapper[4807]: I1202 20:12:47.184108 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-glnxl" Dec 02 20:12:47 crc kubenswrapper[4807]: I1202 20:12:47.186816 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhm5n" event={"ID":"2d05ff98-e881-4a16-a902-09c3afd1bf61","Type":"ContainerStarted","Data":"ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96"} Dec 02 20:12:47 crc kubenswrapper[4807]: I1202 20:12:47.236893 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-glnxl" podStartSLOduration=1.520204809 podStartE2EDuration="38.236865413s" podCreationTimestamp="2025-12-02 20:12:09 +0000 UTC" firstStartedPulling="2025-12-02 20:12:10.272203957 +0000 UTC m=+865.573111452" lastFinishedPulling="2025-12-02 20:12:46.988864561 +0000 UTC m=+902.289772056" observedRunningTime="2025-12-02 20:12:47.211364902 +0000 UTC m=+902.512272427" watchObservedRunningTime="2025-12-02 20:12:47.236865413 +0000 UTC m=+902.537772948" Dec 02 20:12:47 crc kubenswrapper[4807]: I1202 20:12:47.265910 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-wv8s9" podStartSLOduration=3.384933458 podStartE2EDuration="39.265875565s" podCreationTimestamp="2025-12-02 20:12:08 +0000 UTC" firstStartedPulling="2025-12-02 20:12:09.758955669 +0000 UTC m=+865.059863164" lastFinishedPulling="2025-12-02 20:12:45.639897776 +0000 UTC m=+900.940805271" observedRunningTime="2025-12-02 20:12:47.239433517 +0000 UTC m=+902.540341022" watchObservedRunningTime="2025-12-02 20:12:47.265875565 +0000 UTC m=+902.566783060" Dec 02 20:12:48 crc kubenswrapper[4807]: I1202 
20:12:48.199081 4807 generic.go:334] "Generic (PLEG): container finished" podID="2d05ff98-e881-4a16-a902-09c3afd1bf61" containerID="ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96" exitCode=0 Dec 02 20:12:48 crc kubenswrapper[4807]: I1202 20:12:48.199186 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhm5n" event={"ID":"2d05ff98-e881-4a16-a902-09c3afd1bf61","Type":"ContainerDied","Data":"ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96"} Dec 02 20:12:49 crc kubenswrapper[4807]: I1202 20:12:49.209127 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhm5n" event={"ID":"2d05ff98-e881-4a16-a902-09c3afd1bf61","Type":"ContainerStarted","Data":"de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9"} Dec 02 20:12:49 crc kubenswrapper[4807]: I1202 20:12:49.235239 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rhm5n" podStartSLOduration=2.989242675 podStartE2EDuration="6.235217003s" podCreationTimestamp="2025-12-02 20:12:43 +0000 UTC" firstStartedPulling="2025-12-02 20:12:45.55776141 +0000 UTC m=+900.858668905" lastFinishedPulling="2025-12-02 20:12:48.803735738 +0000 UTC m=+904.104643233" observedRunningTime="2025-12-02 20:12:49.231866886 +0000 UTC m=+904.532774381" watchObservedRunningTime="2025-12-02 20:12:49.235217003 +0000 UTC m=+904.536124508" Dec 02 20:12:49 crc kubenswrapper[4807]: I1202 20:12:49.988946 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:49 crc kubenswrapper[4807]: I1202 20:12:49.989026 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:50 crc kubenswrapper[4807]: I1202 20:12:50.034935 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:50 crc kubenswrapper[4807]: I1202 20:12:50.260940 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:53 crc kubenswrapper[4807]: I1202 20:12:53.059170 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtq86"] Dec 02 20:12:53 crc kubenswrapper[4807]: I1202 20:12:53.059836 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dtq86" podUID="d40a0534-b58e-4a62-b3a1-5415909d89ca" containerName="registry-server" containerID="cri-o://636a7cec89badf87775131bad6dfea21c3f447c41529779862c3c0eb8d804be4" gracePeriod=2 Dec 02 20:12:53 crc kubenswrapper[4807]: I1202 20:12:53.783565 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:53 crc kubenswrapper[4807]: I1202 20:12:53.783629 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:53 crc kubenswrapper[4807]: I1202 20:12:53.840948 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:54 crc kubenswrapper[4807]: I1202 20:12:54.296729 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.253340 4807 generic.go:334] "Generic (PLEG): container finished" podID="d40a0534-b58e-4a62-b3a1-5415909d89ca" containerID="636a7cec89badf87775131bad6dfea21c3f447c41529779862c3c0eb8d804be4" exitCode=0 Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.253561 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtq86" 
event={"ID":"d40a0534-b58e-4a62-b3a1-5415909d89ca","Type":"ContainerDied","Data":"636a7cec89badf87775131bad6dfea21c3f447c41529779862c3c0eb8d804be4"} Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.253929 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dtq86" event={"ID":"d40a0534-b58e-4a62-b3a1-5415909d89ca","Type":"ContainerDied","Data":"9e722263c628d39fed6d5767a6f3d9c0aad3de754e0cc4df40bb3c4bf52e983c"} Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.253947 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e722263c628d39fed6d5767a6f3d9c0aad3de754e0cc4df40bb3c4bf52e983c" Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.261300 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.357278 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d46k\" (UniqueName: \"kubernetes.io/projected/d40a0534-b58e-4a62-b3a1-5415909d89ca-kube-api-access-4d46k\") pod \"d40a0534-b58e-4a62-b3a1-5415909d89ca\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.357381 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-catalog-content\") pod \"d40a0534-b58e-4a62-b3a1-5415909d89ca\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.357470 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-utilities\") pod \"d40a0534-b58e-4a62-b3a1-5415909d89ca\" (UID: \"d40a0534-b58e-4a62-b3a1-5415909d89ca\") " Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 
20:12:55.360034 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-utilities" (OuterVolumeSpecName: "utilities") pod "d40a0534-b58e-4a62-b3a1-5415909d89ca" (UID: "d40a0534-b58e-4a62-b3a1-5415909d89ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.364357 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40a0534-b58e-4a62-b3a1-5415909d89ca-kube-api-access-4d46k" (OuterVolumeSpecName: "kube-api-access-4d46k") pod "d40a0534-b58e-4a62-b3a1-5415909d89ca" (UID: "d40a0534-b58e-4a62-b3a1-5415909d89ca"). InnerVolumeSpecName "kube-api-access-4d46k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.394255 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d40a0534-b58e-4a62-b3a1-5415909d89ca" (UID: "d40a0534-b58e-4a62-b3a1-5415909d89ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.459184 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d46k\" (UniqueName: \"kubernetes.io/projected/d40a0534-b58e-4a62-b3a1-5415909d89ca-kube-api-access-4d46k\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.459232 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:55 crc kubenswrapper[4807]: I1202 20:12:55.459246 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d40a0534-b58e-4a62-b3a1-5415909d89ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:56 crc kubenswrapper[4807]: I1202 20:12:56.259001 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dtq86" Dec 02 20:12:56 crc kubenswrapper[4807]: I1202 20:12:56.293927 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtq86"] Dec 02 20:12:56 crc kubenswrapper[4807]: I1202 20:12:56.298507 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dtq86"] Dec 02 20:12:56 crc kubenswrapper[4807]: I1202 20:12:56.982621 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d40a0534-b58e-4a62-b3a1-5415909d89ca" path="/var/lib/kubelet/pods/d40a0534-b58e-4a62-b3a1-5415909d89ca/volumes" Dec 02 20:12:57 crc kubenswrapper[4807]: I1202 20:12:57.459684 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rhm5n"] Dec 02 20:12:57 crc kubenswrapper[4807]: I1202 20:12:57.460570 4807 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-rhm5n" podUID="2d05ff98-e881-4a16-a902-09c3afd1bf61" containerName="registry-server" containerID="cri-o://de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9" gracePeriod=2 Dec 02 20:12:58 crc kubenswrapper[4807]: I1202 20:12:58.949386 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.009874 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr4p8\" (UniqueName: \"kubernetes.io/projected/2d05ff98-e881-4a16-a902-09c3afd1bf61-kube-api-access-vr4p8\") pod \"2d05ff98-e881-4a16-a902-09c3afd1bf61\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.010127 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-utilities\") pod \"2d05ff98-e881-4a16-a902-09c3afd1bf61\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.010176 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-catalog-content\") pod \"2d05ff98-e881-4a16-a902-09c3afd1bf61\" (UID: \"2d05ff98-e881-4a16-a902-09c3afd1bf61\") " Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.011269 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-utilities" (OuterVolumeSpecName: "utilities") pod "2d05ff98-e881-4a16-a902-09c3afd1bf61" (UID: "2d05ff98-e881-4a16-a902-09c3afd1bf61"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.016484 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d05ff98-e881-4a16-a902-09c3afd1bf61-kube-api-access-vr4p8" (OuterVolumeSpecName: "kube-api-access-vr4p8") pod "2d05ff98-e881-4a16-a902-09c3afd1bf61" (UID: "2d05ff98-e881-4a16-a902-09c3afd1bf61"). InnerVolumeSpecName "kube-api-access-vr4p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.067437 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d05ff98-e881-4a16-a902-09c3afd1bf61" (UID: "2d05ff98-e881-4a16-a902-09c3afd1bf61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.111834 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr4p8\" (UniqueName: \"kubernetes.io/projected/2d05ff98-e881-4a16-a902-09c3afd1bf61-kube-api-access-vr4p8\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.111870 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.111879 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d05ff98-e881-4a16-a902-09c3afd1bf61-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.281679 4807 generic.go:334] "Generic (PLEG): container finished" podID="2d05ff98-e881-4a16-a902-09c3afd1bf61" 
containerID="de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9" exitCode=0 Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.281756 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhm5n" event={"ID":"2d05ff98-e881-4a16-a902-09c3afd1bf61","Type":"ContainerDied","Data":"de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9"} Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.281791 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rhm5n" event={"ID":"2d05ff98-e881-4a16-a902-09c3afd1bf61","Type":"ContainerDied","Data":"a38ffa953c3b3054a5230188dd583fd694ea18b33321257c3dfb8feb67510cd1"} Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.281811 4807 scope.go:117] "RemoveContainer" containerID="de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.281848 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rhm5n" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.310526 4807 scope.go:117] "RemoveContainer" containerID="ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.321407 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rhm5n"] Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.326185 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rhm5n"] Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.339518 4807 scope.go:117] "RemoveContainer" containerID="11ea49b7210901458171b75eb85c5ab74f5d79228ab2c27953088350fd9649cf" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.357294 4807 scope.go:117] "RemoveContainer" containerID="de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9" Dec 02 20:12:59 crc kubenswrapper[4807]: E1202 20:12:59.361471 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9\": container with ID starting with de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9 not found: ID does not exist" containerID="de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.362253 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9"} err="failed to get container status \"de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9\": rpc error: code = NotFound desc = could not find container \"de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9\": container with ID starting with de6603ad0cf0734be8c7b0d9603e06c72507daf064663e4a018b5e3c90839cd9 not 
found: ID does not exist" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.362396 4807 scope.go:117] "RemoveContainer" containerID="ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96" Dec 02 20:12:59 crc kubenswrapper[4807]: E1202 20:12:59.363091 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96\": container with ID starting with ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96 not found: ID does not exist" containerID="ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.363153 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96"} err="failed to get container status \"ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96\": rpc error: code = NotFound desc = could not find container \"ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96\": container with ID starting with ccb1810ab4c6db598714ab64bdc7c556a6a0d5951e996a79017eec839b764c96 not found: ID does not exist" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.363200 4807 scope.go:117] "RemoveContainer" containerID="11ea49b7210901458171b75eb85c5ab74f5d79228ab2c27953088350fd9649cf" Dec 02 20:12:59 crc kubenswrapper[4807]: E1202 20:12:59.363612 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ea49b7210901458171b75eb85c5ab74f5d79228ab2c27953088350fd9649cf\": container with ID starting with 11ea49b7210901458171b75eb85c5ab74f5d79228ab2c27953088350fd9649cf not found: ID does not exist" containerID="11ea49b7210901458171b75eb85c5ab74f5d79228ab2c27953088350fd9649cf" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.363647 4807 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ea49b7210901458171b75eb85c5ab74f5d79228ab2c27953088350fd9649cf"} err="failed to get container status \"11ea49b7210901458171b75eb85c5ab74f5d79228ab2c27953088350fd9649cf\": rpc error: code = NotFound desc = could not find container \"11ea49b7210901458171b75eb85c5ab74f5d79228ab2c27953088350fd9649cf\": container with ID starting with 11ea49b7210901458171b75eb85c5ab74f5d79228ab2c27953088350fd9649cf not found: ID does not exist" Dec 02 20:12:59 crc kubenswrapper[4807]: I1202 20:12:59.764343 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-glnxl" Dec 02 20:13:00 crc kubenswrapper[4807]: I1202 20:13:00.979395 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d05ff98-e881-4a16-a902-09c3afd1bf61" path="/var/lib/kubelet/pods/2d05ff98-e881-4a16-a902-09c3afd1bf61/volumes" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.664343 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-knpv4"] Dec 02 20:13:04 crc kubenswrapper[4807]: E1202 20:13:04.665069 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d05ff98-e881-4a16-a902-09c3afd1bf61" containerName="extract-utilities" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.665084 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d05ff98-e881-4a16-a902-09c3afd1bf61" containerName="extract-utilities" Dec 02 20:13:04 crc kubenswrapper[4807]: E1202 20:13:04.665095 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d05ff98-e881-4a16-a902-09c3afd1bf61" containerName="extract-content" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.665102 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d05ff98-e881-4a16-a902-09c3afd1bf61" containerName="extract-content" Dec 02 20:13:04 crc kubenswrapper[4807]: E1202 
20:13:04.665109 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40a0534-b58e-4a62-b3a1-5415909d89ca" containerName="registry-server" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.665116 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40a0534-b58e-4a62-b3a1-5415909d89ca" containerName="registry-server" Dec 02 20:13:04 crc kubenswrapper[4807]: E1202 20:13:04.665126 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40a0534-b58e-4a62-b3a1-5415909d89ca" containerName="extract-utilities" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.665132 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40a0534-b58e-4a62-b3a1-5415909d89ca" containerName="extract-utilities" Dec 02 20:13:04 crc kubenswrapper[4807]: E1202 20:13:04.665142 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d05ff98-e881-4a16-a902-09c3afd1bf61" containerName="registry-server" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.665148 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d05ff98-e881-4a16-a902-09c3afd1bf61" containerName="registry-server" Dec 02 20:13:04 crc kubenswrapper[4807]: E1202 20:13:04.665159 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40a0534-b58e-4a62-b3a1-5415909d89ca" containerName="extract-content" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.665167 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40a0534-b58e-4a62-b3a1-5415909d89ca" containerName="extract-content" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.665298 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40a0534-b58e-4a62-b3a1-5415909d89ca" containerName="registry-server" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.665314 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d05ff98-e881-4a16-a902-09c3afd1bf61" containerName="registry-server" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 
20:13:04.666284 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.681773 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knpv4"] Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.793645 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gd28\" (UniqueName: \"kubernetes.io/projected/e2803dba-db09-4cf8-8e9c-a06b1057358a-kube-api-access-8gd28\") pod \"community-operators-knpv4\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.793800 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-catalog-content\") pod \"community-operators-knpv4\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.793861 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-utilities\") pod \"community-operators-knpv4\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.895490 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gd28\" (UniqueName: \"kubernetes.io/projected/e2803dba-db09-4cf8-8e9c-a06b1057358a-kube-api-access-8gd28\") pod \"community-operators-knpv4\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:04 crc 
kubenswrapper[4807]: I1202 20:13:04.896018 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-catalog-content\") pod \"community-operators-knpv4\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.896150 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-utilities\") pod \"community-operators-knpv4\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.896647 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-utilities\") pod \"community-operators-knpv4\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.896864 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-catalog-content\") pod \"community-operators-knpv4\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.919054 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gd28\" (UniqueName: \"kubernetes.io/projected/e2803dba-db09-4cf8-8e9c-a06b1057358a-kube-api-access-8gd28\") pod \"community-operators-knpv4\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:04 crc kubenswrapper[4807]: I1202 20:13:04.984635 4807 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:05 crc kubenswrapper[4807]: I1202 20:13:05.362267 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knpv4"] Dec 02 20:13:05 crc kubenswrapper[4807]: W1202 20:13:05.370774 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2803dba_db09_4cf8_8e9c_a06b1057358a.slice/crio-03228a662c794ad4c4661b37c147c1865cf401fcc0b22c43a115ec6219a5efe9 WatchSource:0}: Error finding container 03228a662c794ad4c4661b37c147c1865cf401fcc0b22c43a115ec6219a5efe9: Status 404 returned error can't find the container with id 03228a662c794ad4c4661b37c147c1865cf401fcc0b22c43a115ec6219a5efe9 Dec 02 20:13:08 crc kubenswrapper[4807]: I1202 20:13:06.322253 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knpv4" event={"ID":"e2803dba-db09-4cf8-8e9c-a06b1057358a","Type":"ContainerStarted","Data":"03228a662c794ad4c4661b37c147c1865cf401fcc0b22c43a115ec6219a5efe9"} Dec 02 20:13:08 crc kubenswrapper[4807]: I1202 20:13:08.337108 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knpv4" event={"ID":"e2803dba-db09-4cf8-8e9c-a06b1057358a","Type":"ContainerStarted","Data":"0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd"} Dec 02 20:13:09 crc kubenswrapper[4807]: I1202 20:13:09.344159 4807 generic.go:334] "Generic (PLEG): container finished" podID="e2803dba-db09-4cf8-8e9c-a06b1057358a" containerID="0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd" exitCode=0 Dec 02 20:13:09 crc kubenswrapper[4807]: I1202 20:13:09.344555 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knpv4" 
event={"ID":"e2803dba-db09-4cf8-8e9c-a06b1057358a","Type":"ContainerDied","Data":"0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd"} Dec 02 20:13:11 crc kubenswrapper[4807]: I1202 20:13:11.359155 4807 generic.go:334] "Generic (PLEG): container finished" podID="e2803dba-db09-4cf8-8e9c-a06b1057358a" containerID="75eeedbf5a404d43ddeb002c21c2471001af7b0090d189e1f65c1d749d093410" exitCode=0 Dec 02 20:13:11 crc kubenswrapper[4807]: I1202 20:13:11.359275 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knpv4" event={"ID":"e2803dba-db09-4cf8-8e9c-a06b1057358a","Type":"ContainerDied","Data":"75eeedbf5a404d43ddeb002c21c2471001af7b0090d189e1f65c1d749d093410"} Dec 02 20:13:12 crc kubenswrapper[4807]: I1202 20:13:12.373927 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knpv4" event={"ID":"e2803dba-db09-4cf8-8e9c-a06b1057358a","Type":"ContainerStarted","Data":"b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe"} Dec 02 20:13:12 crc kubenswrapper[4807]: I1202 20:13:12.402263 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-knpv4" podStartSLOduration=5.5797971969999995 podStartE2EDuration="8.402239388s" podCreationTimestamp="2025-12-02 20:13:04 +0000 UTC" firstStartedPulling="2025-12-02 20:13:09.346308393 +0000 UTC m=+924.647215888" lastFinishedPulling="2025-12-02 20:13:12.168750584 +0000 UTC m=+927.469658079" observedRunningTime="2025-12-02 20:13:12.399332395 +0000 UTC m=+927.700239900" watchObservedRunningTime="2025-12-02 20:13:12.402239388 +0000 UTC m=+927.703146883" Dec 02 20:13:14 crc kubenswrapper[4807]: I1202 20:13:14.984837 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:14 crc kubenswrapper[4807]: I1202 20:13:14.984903 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:15 crc kubenswrapper[4807]: I1202 20:13:15.051831 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:18 crc kubenswrapper[4807]: I1202 20:13:18.924076 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx"] Dec 02 20:13:18 crc kubenswrapper[4807]: I1202 20:13:18.925493 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:18 crc kubenswrapper[4807]: I1202 20:13:18.928069 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 20:13:18 crc kubenswrapper[4807]: I1202 20:13:18.946733 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx"] Dec 02 20:13:19 crc kubenswrapper[4807]: I1202 20:13:19.006628 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx\" (UID: \"718520aa-df66-40e9-a10a-ac83475f1997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:19 crc kubenswrapper[4807]: I1202 20:13:19.006678 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bpm6\" (UniqueName: \"kubernetes.io/projected/718520aa-df66-40e9-a10a-ac83475f1997-kube-api-access-2bpm6\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx\" (UID: \"718520aa-df66-40e9-a10a-ac83475f1997\") " 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:19 crc kubenswrapper[4807]: I1202 20:13:19.006708 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx\" (UID: \"718520aa-df66-40e9-a10a-ac83475f1997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:19 crc kubenswrapper[4807]: I1202 20:13:19.107973 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx\" (UID: \"718520aa-df66-40e9-a10a-ac83475f1997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:19 crc kubenswrapper[4807]: I1202 20:13:19.108020 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bpm6\" (UniqueName: \"kubernetes.io/projected/718520aa-df66-40e9-a10a-ac83475f1997-kube-api-access-2bpm6\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx\" (UID: \"718520aa-df66-40e9-a10a-ac83475f1997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:19 crc kubenswrapper[4807]: I1202 20:13:19.108045 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx\" (UID: \"718520aa-df66-40e9-a10a-ac83475f1997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:19 crc kubenswrapper[4807]: 
I1202 20:13:19.108579 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx\" (UID: \"718520aa-df66-40e9-a10a-ac83475f1997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:19 crc kubenswrapper[4807]: I1202 20:13:19.108806 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx\" (UID: \"718520aa-df66-40e9-a10a-ac83475f1997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:19 crc kubenswrapper[4807]: I1202 20:13:19.130049 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bpm6\" (UniqueName: \"kubernetes.io/projected/718520aa-df66-40e9-a10a-ac83475f1997-kube-api-access-2bpm6\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx\" (UID: \"718520aa-df66-40e9-a10a-ac83475f1997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:19 crc kubenswrapper[4807]: I1202 20:13:19.245012 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:19 crc kubenswrapper[4807]: I1202 20:13:19.476366 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx"] Dec 02 20:13:20 crc kubenswrapper[4807]: I1202 20:13:20.424937 4807 generic.go:334] "Generic (PLEG): container finished" podID="718520aa-df66-40e9-a10a-ac83475f1997" containerID="30a7817490817e24539deb11debe419bab063a61530bc99495f94c501787db3c" exitCode=0 Dec 02 20:13:20 crc kubenswrapper[4807]: I1202 20:13:20.425044 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" event={"ID":"718520aa-df66-40e9-a10a-ac83475f1997","Type":"ContainerDied","Data":"30a7817490817e24539deb11debe419bab063a61530bc99495f94c501787db3c"} Dec 02 20:13:20 crc kubenswrapper[4807]: I1202 20:13:20.426971 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" event={"ID":"718520aa-df66-40e9-a10a-ac83475f1997","Type":"ContainerStarted","Data":"bfdf883c88745bc724fdd3494173398bed729d2e0e4693bce48a8cf24c88360c"} Dec 02 20:13:23 crc kubenswrapper[4807]: I1202 20:13:23.449315 4807 generic.go:334] "Generic (PLEG): container finished" podID="718520aa-df66-40e9-a10a-ac83475f1997" containerID="5a365e89c4e2d068e1996ca3b02f8e11257ef2a4f1e5715a207272a6b76a6fb4" exitCode=0 Dec 02 20:13:23 crc kubenswrapper[4807]: I1202 20:13:23.449521 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" event={"ID":"718520aa-df66-40e9-a10a-ac83475f1997","Type":"ContainerDied","Data":"5a365e89c4e2d068e1996ca3b02f8e11257ef2a4f1e5715a207272a6b76a6fb4"} Dec 02 20:13:24 crc kubenswrapper[4807]: I1202 20:13:24.460079 4807 
generic.go:334] "Generic (PLEG): container finished" podID="718520aa-df66-40e9-a10a-ac83475f1997" containerID="abf379bc4e25f542058d16ace861a66f9df57156f5ffca2bcbefe457d95f7403" exitCode=0 Dec 02 20:13:24 crc kubenswrapper[4807]: I1202 20:13:24.460155 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" event={"ID":"718520aa-df66-40e9-a10a-ac83475f1997","Type":"ContainerDied","Data":"abf379bc4e25f542058d16ace861a66f9df57156f5ffca2bcbefe457d95f7403"} Dec 02 20:13:25 crc kubenswrapper[4807]: I1202 20:13:25.042884 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:25 crc kubenswrapper[4807]: I1202 20:13:25.732896 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:25 crc kubenswrapper[4807]: I1202 20:13:25.808835 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-util\") pod \"718520aa-df66-40e9-a10a-ac83475f1997\" (UID: \"718520aa-df66-40e9-a10a-ac83475f1997\") " Dec 02 20:13:25 crc kubenswrapper[4807]: I1202 20:13:25.808934 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bpm6\" (UniqueName: \"kubernetes.io/projected/718520aa-df66-40e9-a10a-ac83475f1997-kube-api-access-2bpm6\") pod \"718520aa-df66-40e9-a10a-ac83475f1997\" (UID: \"718520aa-df66-40e9-a10a-ac83475f1997\") " Dec 02 20:13:25 crc kubenswrapper[4807]: I1202 20:13:25.809079 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-bundle\") pod \"718520aa-df66-40e9-a10a-ac83475f1997\" (UID: 
\"718520aa-df66-40e9-a10a-ac83475f1997\") " Dec 02 20:13:25 crc kubenswrapper[4807]: I1202 20:13:25.809954 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-bundle" (OuterVolumeSpecName: "bundle") pod "718520aa-df66-40e9-a10a-ac83475f1997" (UID: "718520aa-df66-40e9-a10a-ac83475f1997"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:13:25 crc kubenswrapper[4807]: I1202 20:13:25.816892 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/718520aa-df66-40e9-a10a-ac83475f1997-kube-api-access-2bpm6" (OuterVolumeSpecName: "kube-api-access-2bpm6") pod "718520aa-df66-40e9-a10a-ac83475f1997" (UID: "718520aa-df66-40e9-a10a-ac83475f1997"). InnerVolumeSpecName "kube-api-access-2bpm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:13:25 crc kubenswrapper[4807]: I1202 20:13:25.821661 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-util" (OuterVolumeSpecName: "util") pod "718520aa-df66-40e9-a10a-ac83475f1997" (UID: "718520aa-df66-40e9-a10a-ac83475f1997"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:13:25 crc kubenswrapper[4807]: I1202 20:13:25.911323 4807 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:13:25 crc kubenswrapper[4807]: I1202 20:13:25.911379 4807 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/718520aa-df66-40e9-a10a-ac83475f1997-util\") on node \"crc\" DevicePath \"\"" Dec 02 20:13:25 crc kubenswrapper[4807]: I1202 20:13:25.911399 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bpm6\" (UniqueName: \"kubernetes.io/projected/718520aa-df66-40e9-a10a-ac83475f1997-kube-api-access-2bpm6\") on node \"crc\" DevicePath \"\"" Dec 02 20:13:26 crc kubenswrapper[4807]: I1202 20:13:26.477050 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" event={"ID":"718520aa-df66-40e9-a10a-ac83475f1997","Type":"ContainerDied","Data":"bfdf883c88745bc724fdd3494173398bed729d2e0e4693bce48a8cf24c88360c"} Dec 02 20:13:26 crc kubenswrapper[4807]: I1202 20:13:26.477120 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfdf883c88745bc724fdd3494173398bed729d2e0e4693bce48a8cf24c88360c" Dec 02 20:13:26 crc kubenswrapper[4807]: I1202 20:13:26.477228 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx" Dec 02 20:13:28 crc kubenswrapper[4807]: I1202 20:13:28.292564 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:13:28 crc kubenswrapper[4807]: I1202 20:13:28.292654 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:13:29 crc kubenswrapper[4807]: I1202 20:13:29.255395 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knpv4"] Dec 02 20:13:29 crc kubenswrapper[4807]: I1202 20:13:29.255775 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-knpv4" podUID="e2803dba-db09-4cf8-8e9c-a06b1057358a" containerName="registry-server" containerID="cri-o://b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe" gracePeriod=2 Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.227938 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.411676 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-tqh42"] Dec 02 20:13:30 crc kubenswrapper[4807]: E1202 20:13:30.411999 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718520aa-df66-40e9-a10a-ac83475f1997" containerName="extract" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.412020 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="718520aa-df66-40e9-a10a-ac83475f1997" containerName="extract" Dec 02 20:13:30 crc kubenswrapper[4807]: E1202 20:13:30.412037 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2803dba-db09-4cf8-8e9c-a06b1057358a" containerName="extract-content" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.412046 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2803dba-db09-4cf8-8e9c-a06b1057358a" containerName="extract-content" Dec 02 20:13:30 crc kubenswrapper[4807]: E1202 20:13:30.412056 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718520aa-df66-40e9-a10a-ac83475f1997" containerName="util" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.412064 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="718520aa-df66-40e9-a10a-ac83475f1997" containerName="util" Dec 02 20:13:30 crc kubenswrapper[4807]: E1202 20:13:30.412074 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2803dba-db09-4cf8-8e9c-a06b1057358a" containerName="registry-server" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.412082 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2803dba-db09-4cf8-8e9c-a06b1057358a" containerName="registry-server" Dec 02 20:13:30 crc kubenswrapper[4807]: E1202 20:13:30.412095 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2803dba-db09-4cf8-8e9c-a06b1057358a" 
containerName="extract-utilities" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.412104 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2803dba-db09-4cf8-8e9c-a06b1057358a" containerName="extract-utilities" Dec 02 20:13:30 crc kubenswrapper[4807]: E1202 20:13:30.412118 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718520aa-df66-40e9-a10a-ac83475f1997" containerName="pull" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.412126 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="718520aa-df66-40e9-a10a-ac83475f1997" containerName="pull" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.412257 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2803dba-db09-4cf8-8e9c-a06b1057358a" containerName="registry-server" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.412268 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="718520aa-df66-40e9-a10a-ac83475f1997" containerName="extract" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.412817 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tqh42" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.415372 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.415555 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.417056 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-5zpmh" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.428036 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-tqh42"] Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.428708 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-catalog-content\") pod \"e2803dba-db09-4cf8-8e9c-a06b1057358a\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.428777 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-utilities\") pod \"e2803dba-db09-4cf8-8e9c-a06b1057358a\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.428855 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gd28\" (UniqueName: \"kubernetes.io/projected/e2803dba-db09-4cf8-8e9c-a06b1057358a-kube-api-access-8gd28\") pod \"e2803dba-db09-4cf8-8e9c-a06b1057358a\" (UID: \"e2803dba-db09-4cf8-8e9c-a06b1057358a\") " Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.429049 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwwnc\" (UniqueName: \"kubernetes.io/projected/6f69b73b-75f4-4c02-a252-efd2ea50b022-kube-api-access-cwwnc\") pod \"nmstate-operator-5b5b58f5c8-tqh42\" (UID: \"6f69b73b-75f4-4c02-a252-efd2ea50b022\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tqh42" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.434239 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-utilities" (OuterVolumeSpecName: "utilities") pod "e2803dba-db09-4cf8-8e9c-a06b1057358a" (UID: "e2803dba-db09-4cf8-8e9c-a06b1057358a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.447937 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2803dba-db09-4cf8-8e9c-a06b1057358a-kube-api-access-8gd28" (OuterVolumeSpecName: "kube-api-access-8gd28") pod "e2803dba-db09-4cf8-8e9c-a06b1057358a" (UID: "e2803dba-db09-4cf8-8e9c-a06b1057358a"). InnerVolumeSpecName "kube-api-access-8gd28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.482516 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2803dba-db09-4cf8-8e9c-a06b1057358a" (UID: "e2803dba-db09-4cf8-8e9c-a06b1057358a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.505939 4807 generic.go:334] "Generic (PLEG): container finished" podID="e2803dba-db09-4cf8-8e9c-a06b1057358a" containerID="b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe" exitCode=0 Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.505986 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knpv4" event={"ID":"e2803dba-db09-4cf8-8e9c-a06b1057358a","Type":"ContainerDied","Data":"b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe"} Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.506021 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knpv4" event={"ID":"e2803dba-db09-4cf8-8e9c-a06b1057358a","Type":"ContainerDied","Data":"03228a662c794ad4c4661b37c147c1865cf401fcc0b22c43a115ec6219a5efe9"} Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.506038 4807 scope.go:117] "RemoveContainer" containerID="b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.506066 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-knpv4" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.523044 4807 scope.go:117] "RemoveContainer" containerID="75eeedbf5a404d43ddeb002c21c2471001af7b0090d189e1f65c1d749d093410" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.529758 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwwnc\" (UniqueName: \"kubernetes.io/projected/6f69b73b-75f4-4c02-a252-efd2ea50b022-kube-api-access-cwwnc\") pod \"nmstate-operator-5b5b58f5c8-tqh42\" (UID: \"6f69b73b-75f4-4c02-a252-efd2ea50b022\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tqh42" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.529887 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.529901 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2803dba-db09-4cf8-8e9c-a06b1057358a-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.529912 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gd28\" (UniqueName: \"kubernetes.io/projected/e2803dba-db09-4cf8-8e9c-a06b1057358a-kube-api-access-8gd28\") on node \"crc\" DevicePath \"\"" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.549032 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knpv4"] Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.550230 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-knpv4"] Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.555887 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwwnc\" 
(UniqueName: \"kubernetes.io/projected/6f69b73b-75f4-4c02-a252-efd2ea50b022-kube-api-access-cwwnc\") pod \"nmstate-operator-5b5b58f5c8-tqh42\" (UID: \"6f69b73b-75f4-4c02-a252-efd2ea50b022\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tqh42" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.559120 4807 scope.go:117] "RemoveContainer" containerID="0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.584117 4807 scope.go:117] "RemoveContainer" containerID="b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe" Dec 02 20:13:30 crc kubenswrapper[4807]: E1202 20:13:30.584703 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe\": container with ID starting with b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe not found: ID does not exist" containerID="b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.584757 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe"} err="failed to get container status \"b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe\": rpc error: code = NotFound desc = could not find container \"b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe\": container with ID starting with b202751bf10ed3216710b4ab871c654fa1af37085e99a03b172da35857965fbe not found: ID does not exist" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.584785 4807 scope.go:117] "RemoveContainer" containerID="75eeedbf5a404d43ddeb002c21c2471001af7b0090d189e1f65c1d749d093410" Dec 02 20:13:30 crc kubenswrapper[4807]: E1202 20:13:30.585031 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"75eeedbf5a404d43ddeb002c21c2471001af7b0090d189e1f65c1d749d093410\": container with ID starting with 75eeedbf5a404d43ddeb002c21c2471001af7b0090d189e1f65c1d749d093410 not found: ID does not exist" containerID="75eeedbf5a404d43ddeb002c21c2471001af7b0090d189e1f65c1d749d093410" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.585061 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75eeedbf5a404d43ddeb002c21c2471001af7b0090d189e1f65c1d749d093410"} err="failed to get container status \"75eeedbf5a404d43ddeb002c21c2471001af7b0090d189e1f65c1d749d093410\": rpc error: code = NotFound desc = could not find container \"75eeedbf5a404d43ddeb002c21c2471001af7b0090d189e1f65c1d749d093410\": container with ID starting with 75eeedbf5a404d43ddeb002c21c2471001af7b0090d189e1f65c1d749d093410 not found: ID does not exist" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.585079 4807 scope.go:117] "RemoveContainer" containerID="0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd" Dec 02 20:13:30 crc kubenswrapper[4807]: E1202 20:13:30.585336 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd\": container with ID starting with 0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd not found: ID does not exist" containerID="0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.585365 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd"} err="failed to get container status \"0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd\": rpc error: code = NotFound desc = could not find container 
\"0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd\": container with ID starting with 0235a4ba6732d73c13b16e660ab2fa2a33dee530321137761b529556803652dd not found: ID does not exist" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.732794 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tqh42" Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.951849 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-tqh42"] Dec 02 20:13:30 crc kubenswrapper[4807]: I1202 20:13:30.980451 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2803dba-db09-4cf8-8e9c-a06b1057358a" path="/var/lib/kubelet/pods/e2803dba-db09-4cf8-8e9c-a06b1057358a/volumes" Dec 02 20:13:31 crc kubenswrapper[4807]: I1202 20:13:31.516132 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tqh42" event={"ID":"6f69b73b-75f4-4c02-a252-efd2ea50b022","Type":"ContainerStarted","Data":"fbc3698ef83e4bd651ac21b106e3f224c2ad2c7ef5920feb10ab2ca795136ed1"} Dec 02 20:13:33 crc kubenswrapper[4807]: I1202 20:13:33.527636 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tqh42" event={"ID":"6f69b73b-75f4-4c02-a252-efd2ea50b022","Type":"ContainerStarted","Data":"2e619828c820d1f9dd7cbc85634202815f18777bb5f112edb333265040118d18"} Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.931487 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-tqh42" podStartSLOduration=7.8480800120000005 podStartE2EDuration="9.93146416s" podCreationTimestamp="2025-12-02 20:13:30 +0000 UTC" firstStartedPulling="2025-12-02 20:13:30.96255707 +0000 UTC m=+946.263464565" lastFinishedPulling="2025-12-02 20:13:33.045941218 +0000 UTC m=+948.346848713" observedRunningTime="2025-12-02 
20:13:33.551053885 +0000 UTC m=+948.851961380" watchObservedRunningTime="2025-12-02 20:13:39.93146416 +0000 UTC m=+955.232371655" Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.935539 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp"] Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.936708 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp" Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.938482 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-74bfc" Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.946526 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj"] Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.947383 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.949484 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.951690 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp"] Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.966700 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlxbt\" (UniqueName: \"kubernetes.io/projected/01df7c9d-768d-417f-a7ed-7865655d889d-kube-api-access-vlxbt\") pod \"nmstate-metrics-7f946cbc9-6vnbp\" (UID: \"01df7c9d-768d-417f-a7ed-7865655d889d\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp" Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.966864 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4cbjc\" (UniqueName: \"kubernetes.io/projected/4a32ad98-5354-49e6-957e-ad0828445a24-kube-api-access-4cbjc\") pod \"nmstate-webhook-5f6d4c5ccb-k8qmj\" (UID: \"4a32ad98-5354-49e6-957e-ad0828445a24\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.966944 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4a32ad98-5354-49e6-957e-ad0828445a24-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-k8qmj\" (UID: \"4a32ad98-5354-49e6-957e-ad0828445a24\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.971155 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-wd9cv"] Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.972207 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:39 crc kubenswrapper[4807]: I1202 20:13:39.985208 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj"] Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.067832 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlxbt\" (UniqueName: \"kubernetes.io/projected/01df7c9d-768d-417f-a7ed-7865655d889d-kube-api-access-vlxbt\") pod \"nmstate-metrics-7f946cbc9-6vnbp\" (UID: \"01df7c9d-768d-417f-a7ed-7865655d889d\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.068204 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cbjc\" (UniqueName: \"kubernetes.io/projected/4a32ad98-5354-49e6-957e-ad0828445a24-kube-api-access-4cbjc\") pod \"nmstate-webhook-5f6d4c5ccb-k8qmj\" (UID: \"4a32ad98-5354-49e6-957e-ad0828445a24\") 
" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.068346 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4a32ad98-5354-49e6-957e-ad0828445a24-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-k8qmj\" (UID: \"4a32ad98-5354-49e6-957e-ad0828445a24\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" Dec 02 20:13:40 crc kubenswrapper[4807]: E1202 20:13:40.068475 4807 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 02 20:13:40 crc kubenswrapper[4807]: E1202 20:13:40.068549 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a32ad98-5354-49e6-957e-ad0828445a24-tls-key-pair podName:4a32ad98-5354-49e6-957e-ad0828445a24 nodeName:}" failed. No retries permitted until 2025-12-02 20:13:40.568529733 +0000 UTC m=+955.869437228 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/4a32ad98-5354-49e6-957e-ad0828445a24-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-k8qmj" (UID: "4a32ad98-5354-49e6-957e-ad0828445a24") : secret "openshift-nmstate-webhook" not found Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.091049 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp"] Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.094562 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlxbt\" (UniqueName: \"kubernetes.io/projected/01df7c9d-768d-417f-a7ed-7865655d889d-kube-api-access-vlxbt\") pod \"nmstate-metrics-7f946cbc9-6vnbp\" (UID: \"01df7c9d-768d-417f-a7ed-7865655d889d\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.095084 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.098123 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ws95q" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.098292 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.098756 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cbjc\" (UniqueName: \"kubernetes.io/projected/4a32ad98-5354-49e6-957e-ad0828445a24-kube-api-access-4cbjc\") pod \"nmstate-webhook-5f6d4c5ccb-k8qmj\" (UID: \"4a32ad98-5354-49e6-957e-ad0828445a24\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.098867 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 
02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.119123 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp"] Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.169645 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8b23693d-f1f9-4ae2-9558-44a4a25745bd-dbus-socket\") pod \"nmstate-handler-wd9cv\" (UID: \"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.169947 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8b23693d-f1f9-4ae2-9558-44a4a25745bd-ovs-socket\") pod \"nmstate-handler-wd9cv\" (UID: \"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.170016 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbfnd\" (UniqueName: \"kubernetes.io/projected/37bb14e1-531b-43cf-b232-c11257dcf690-kube-api-access-lbfnd\") pod \"nmstate-console-plugin-7fbb5f6569-skgcp\" (UID: \"37bb14e1-531b-43cf-b232-c11257dcf690\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.170145 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/37bb14e1-531b-43cf-b232-c11257dcf690-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-skgcp\" (UID: \"37bb14e1-531b-43cf-b232-c11257dcf690\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.170230 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8b23693d-f1f9-4ae2-9558-44a4a25745bd-nmstate-lock\") pod \"nmstate-handler-wd9cv\" (UID: \"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.170303 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/37bb14e1-531b-43cf-b232-c11257dcf690-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-skgcp\" (UID: \"37bb14e1-531b-43cf-b232-c11257dcf690\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.170372 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjc6t\" (UniqueName: \"kubernetes.io/projected/8b23693d-f1f9-4ae2-9558-44a4a25745bd-kube-api-access-rjc6t\") pod \"nmstate-handler-wd9cv\" (UID: \"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.262067 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.271865 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8b23693d-f1f9-4ae2-9558-44a4a25745bd-dbus-socket\") pod \"nmstate-handler-wd9cv\" (UID: \"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.272202 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8b23693d-f1f9-4ae2-9558-44a4a25745bd-ovs-socket\") pod \"nmstate-handler-wd9cv\" (UID: \"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.272221 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbfnd\" (UniqueName: \"kubernetes.io/projected/37bb14e1-531b-43cf-b232-c11257dcf690-kube-api-access-lbfnd\") pod \"nmstate-console-plugin-7fbb5f6569-skgcp\" (UID: \"37bb14e1-531b-43cf-b232-c11257dcf690\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.272244 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/37bb14e1-531b-43cf-b232-c11257dcf690-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-skgcp\" (UID: \"37bb14e1-531b-43cf-b232-c11257dcf690\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.272275 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8b23693d-f1f9-4ae2-9558-44a4a25745bd-nmstate-lock\") pod \"nmstate-handler-wd9cv\" (UID: 
\"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.272290 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/37bb14e1-531b-43cf-b232-c11257dcf690-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-skgcp\" (UID: \"37bb14e1-531b-43cf-b232-c11257dcf690\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.272307 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjc6t\" (UniqueName: \"kubernetes.io/projected/8b23693d-f1f9-4ae2-9558-44a4a25745bd-kube-api-access-rjc6t\") pod \"nmstate-handler-wd9cv\" (UID: \"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.272652 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8b23693d-f1f9-4ae2-9558-44a4a25745bd-dbus-socket\") pod \"nmstate-handler-wd9cv\" (UID: \"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: E1202 20:13:40.272697 4807 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 02 20:13:40 crc kubenswrapper[4807]: E1202 20:13:40.272906 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37bb14e1-531b-43cf-b232-c11257dcf690-plugin-serving-cert podName:37bb14e1-531b-43cf-b232-c11257dcf690 nodeName:}" failed. No retries permitted until 2025-12-02 20:13:40.772883351 +0000 UTC m=+956.073790846 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/37bb14e1-531b-43cf-b232-c11257dcf690-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-skgcp" (UID: "37bb14e1-531b-43cf-b232-c11257dcf690") : secret "plugin-serving-cert" not found Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.272740 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8b23693d-f1f9-4ae2-9558-44a4a25745bd-ovs-socket\") pod \"nmstate-handler-wd9cv\" (UID: \"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.273297 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8b23693d-f1f9-4ae2-9558-44a4a25745bd-nmstate-lock\") pod \"nmstate-handler-wd9cv\" (UID: \"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.273840 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/37bb14e1-531b-43cf-b232-c11257dcf690-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-skgcp\" (UID: \"37bb14e1-531b-43cf-b232-c11257dcf690\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.281449 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-859bbdd485-5q7m9"] Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.282468 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.299244 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbfnd\" (UniqueName: \"kubernetes.io/projected/37bb14e1-531b-43cf-b232-c11257dcf690-kube-api-access-lbfnd\") pod \"nmstate-console-plugin-7fbb5f6569-skgcp\" (UID: \"37bb14e1-531b-43cf-b232-c11257dcf690\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.301573 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-859bbdd485-5q7m9"] Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.302497 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjc6t\" (UniqueName: \"kubernetes.io/projected/8b23693d-f1f9-4ae2-9558-44a4a25745bd-kube-api-access-rjc6t\") pod \"nmstate-handler-wd9cv\" (UID: \"8b23693d-f1f9-4ae2-9558-44a4a25745bd\") " pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.478533 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9crfz\" (UniqueName: \"kubernetes.io/projected/72d6902d-b5bc-49a7-b067-46a1177cbfd9-kube-api-access-9crfz\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.478590 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-oauth-serving-cert\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.478649 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-service-ca\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.478685 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72d6902d-b5bc-49a7-b067-46a1177cbfd9-console-oauth-config\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.478707 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-console-config\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.478756 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72d6902d-b5bc-49a7-b067-46a1177cbfd9-console-serving-cert\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.478788 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-trusted-ca-bundle\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc 
kubenswrapper[4807]: I1202 20:13:40.575214 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp"] Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.579810 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72d6902d-b5bc-49a7-b067-46a1177cbfd9-console-serving-cert\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.579854 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-trusted-ca-bundle\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.579876 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4a32ad98-5354-49e6-957e-ad0828445a24-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-k8qmj\" (UID: \"4a32ad98-5354-49e6-957e-ad0828445a24\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.579901 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9crfz\" (UniqueName: \"kubernetes.io/projected/72d6902d-b5bc-49a7-b067-46a1177cbfd9-kube-api-access-9crfz\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.579918 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-oauth-serving-cert\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.579966 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-service-ca\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.579995 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72d6902d-b5bc-49a7-b067-46a1177cbfd9-console-oauth-config\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.580010 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-console-config\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.580834 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-console-config\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.582096 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-trusted-ca-bundle\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.582414 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-oauth-serving-cert\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.582697 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/72d6902d-b5bc-49a7-b067-46a1177cbfd9-service-ca\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.588190 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/72d6902d-b5bc-49a7-b067-46a1177cbfd9-console-oauth-config\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.588659 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.603878 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/72d6902d-b5bc-49a7-b067-46a1177cbfd9-console-serving-cert\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.609400 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4a32ad98-5354-49e6-957e-ad0828445a24-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-k8qmj\" (UID: \"4a32ad98-5354-49e6-957e-ad0828445a24\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.620619 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9crfz\" (UniqueName: \"kubernetes.io/projected/72d6902d-b5bc-49a7-b067-46a1177cbfd9-kube-api-access-9crfz\") pod \"console-859bbdd485-5q7m9\" (UID: \"72d6902d-b5bc-49a7-b067-46a1177cbfd9\") " pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: W1202 20:13:40.623701 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b23693d_f1f9_4ae2_9558_44a4a25745bd.slice/crio-d909b0e2237723f305c2706f171c66ea7ebd41ab9777d5baa0a2e1074a91ff1c WatchSource:0}: Error finding container d909b0e2237723f305c2706f171c66ea7ebd41ab9777d5baa0a2e1074a91ff1c: Status 404 returned error can't find the container with id d909b0e2237723f305c2706f171c66ea7ebd41ab9777d5baa0a2e1074a91ff1c Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.647199 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.782147 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/37bb14e1-531b-43cf-b232-c11257dcf690-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-skgcp\" (UID: \"37bb14e1-531b-43cf-b232-c11257dcf690\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.786181 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/37bb14e1-531b-43cf-b232-c11257dcf690-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-skgcp\" (UID: \"37bb14e1-531b-43cf-b232-c11257dcf690\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.859976 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-859bbdd485-5q7m9"] Dec 02 20:13:40 crc kubenswrapper[4807]: W1202 20:13:40.871250 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72d6902d_b5bc_49a7_b067_46a1177cbfd9.slice/crio-5fab75051b7ef42c8bbd94daa99a26a48f6f85eca88a35cf65bef6d8828be8f7 WatchSource:0}: Error finding container 5fab75051b7ef42c8bbd94daa99a26a48f6f85eca88a35cf65bef6d8828be8f7: Status 404 returned error can't find the container with id 5fab75051b7ef42c8bbd94daa99a26a48f6f85eca88a35cf65bef6d8828be8f7 Dec 02 20:13:40 crc kubenswrapper[4807]: I1202 20:13:40.872460 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" Dec 02 20:13:41 crc kubenswrapper[4807]: I1202 20:13:41.039556 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" Dec 02 20:13:41 crc kubenswrapper[4807]: I1202 20:13:41.069096 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj"] Dec 02 20:13:41 crc kubenswrapper[4807]: W1202 20:13:41.082584 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a32ad98_5354_49e6_957e_ad0828445a24.slice/crio-8b8bf05e8ac606c5b871b0d68f1d4e2c79a51373ae45470541a55afcb156bda6 WatchSource:0}: Error finding container 8b8bf05e8ac606c5b871b0d68f1d4e2c79a51373ae45470541a55afcb156bda6: Status 404 returned error can't find the container with id 8b8bf05e8ac606c5b871b0d68f1d4e2c79a51373ae45470541a55afcb156bda6 Dec 02 20:13:41 crc kubenswrapper[4807]: I1202 20:13:41.438214 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp"] Dec 02 20:13:41 crc kubenswrapper[4807]: I1202 20:13:41.592870 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp" event={"ID":"01df7c9d-768d-417f-a7ed-7865655d889d","Type":"ContainerStarted","Data":"191d9599f3123e635f01b6ae28fe1c42170fc21472c5606b2bee956fa72ccd92"} Dec 02 20:13:41 crc kubenswrapper[4807]: I1202 20:13:41.595086 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wd9cv" event={"ID":"8b23693d-f1f9-4ae2-9558-44a4a25745bd","Type":"ContainerStarted","Data":"d909b0e2237723f305c2706f171c66ea7ebd41ab9777d5baa0a2e1074a91ff1c"} Dec 02 20:13:41 crc kubenswrapper[4807]: I1202 20:13:41.598059 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859bbdd485-5q7m9" event={"ID":"72d6902d-b5bc-49a7-b067-46a1177cbfd9","Type":"ContainerStarted","Data":"cd1bd6acb7eb983a49ea8b4e52fb6dc8fb00978cd47199800f65520a1f35221f"} Dec 02 20:13:41 crc kubenswrapper[4807]: I1202 
20:13:41.598114 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-859bbdd485-5q7m9" event={"ID":"72d6902d-b5bc-49a7-b067-46a1177cbfd9","Type":"ContainerStarted","Data":"5fab75051b7ef42c8bbd94daa99a26a48f6f85eca88a35cf65bef6d8828be8f7"} Dec 02 20:13:41 crc kubenswrapper[4807]: I1202 20:13:41.600166 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" event={"ID":"4a32ad98-5354-49e6-957e-ad0828445a24","Type":"ContainerStarted","Data":"8b8bf05e8ac606c5b871b0d68f1d4e2c79a51373ae45470541a55afcb156bda6"} Dec 02 20:13:41 crc kubenswrapper[4807]: I1202 20:13:41.602029 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" event={"ID":"37bb14e1-531b-43cf-b232-c11257dcf690","Type":"ContainerStarted","Data":"f21bcbb5454e5ceb67de931439d95fbed52324421d60943c77263054a2ae8dc0"} Dec 02 20:13:41 crc kubenswrapper[4807]: I1202 20:13:41.629824 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-859bbdd485-5q7m9" podStartSLOduration=1.629796593 podStartE2EDuration="1.629796593s" podCreationTimestamp="2025-12-02 20:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:13:41.626566412 +0000 UTC m=+956.927473957" watchObservedRunningTime="2025-12-02 20:13:41.629796593 +0000 UTC m=+956.930704098" Dec 02 20:13:44 crc kubenswrapper[4807]: I1202 20:13:44.622596 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" event={"ID":"4a32ad98-5354-49e6-957e-ad0828445a24","Type":"ContainerStarted","Data":"88cfd4e50584ded33681790267015666358d278029e7e180d5ab7b78f9d0062d"} Dec 02 20:13:44 crc kubenswrapper[4807]: I1202 20:13:44.623612 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" Dec 02 20:13:44 crc kubenswrapper[4807]: I1202 20:13:44.624605 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp" event={"ID":"01df7c9d-768d-417f-a7ed-7865655d889d","Type":"ContainerStarted","Data":"9e37f16cb075d903d99e7f09d77badbd3016158998ddd505382c7dab067668ca"} Dec 02 20:13:44 crc kubenswrapper[4807]: I1202 20:13:44.647859 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" podStartSLOduration=2.385266268 podStartE2EDuration="5.647837145s" podCreationTimestamp="2025-12-02 20:13:39 +0000 UTC" firstStartedPulling="2025-12-02 20:13:41.085134617 +0000 UTC m=+956.386042112" lastFinishedPulling="2025-12-02 20:13:44.347705494 +0000 UTC m=+959.648612989" observedRunningTime="2025-12-02 20:13:44.646490636 +0000 UTC m=+959.947398131" watchObservedRunningTime="2025-12-02 20:13:44.647837145 +0000 UTC m=+959.948744640" Dec 02 20:13:45 crc kubenswrapper[4807]: I1202 20:13:45.632219 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wd9cv" event={"ID":"8b23693d-f1f9-4ae2-9558-44a4a25745bd","Type":"ContainerStarted","Data":"8cadccc5ec507f95dd78e77db0fc15c55c40968e4ef2775da3763e10c486e666"} Dec 02 20:13:45 crc kubenswrapper[4807]: I1202 20:13:45.652430 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-wd9cv" podStartSLOduration=2.944904978 podStartE2EDuration="6.652411958s" podCreationTimestamp="2025-12-02 20:13:39 +0000 UTC" firstStartedPulling="2025-12-02 20:13:40.625652682 +0000 UTC m=+955.926560177" lastFinishedPulling="2025-12-02 20:13:44.333159672 +0000 UTC m=+959.634067157" observedRunningTime="2025-12-02 20:13:45.650549255 +0000 UTC m=+960.951456750" watchObservedRunningTime="2025-12-02 20:13:45.652411958 +0000 UTC m=+960.953319453" Dec 02 20:13:46 crc kubenswrapper[4807]: 
I1202 20:13:46.640160 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" event={"ID":"37bb14e1-531b-43cf-b232-c11257dcf690","Type":"ContainerStarted","Data":"8e64796a0f2be5396880c611aa85563a3a70c63f114b363febcc16afbcdede2c"} Dec 02 20:13:46 crc kubenswrapper[4807]: I1202 20:13:46.640780 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:46 crc kubenswrapper[4807]: I1202 20:13:46.670190 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-skgcp" podStartSLOduration=2.462802526 podStartE2EDuration="6.670161133s" podCreationTimestamp="2025-12-02 20:13:40 +0000 UTC" firstStartedPulling="2025-12-02 20:13:41.447232643 +0000 UTC m=+956.748140138" lastFinishedPulling="2025-12-02 20:13:45.65459125 +0000 UTC m=+960.955498745" observedRunningTime="2025-12-02 20:13:46.657294759 +0000 UTC m=+961.958202264" watchObservedRunningTime="2025-12-02 20:13:46.670161133 +0000 UTC m=+961.971068628" Dec 02 20:13:47 crc kubenswrapper[4807]: I1202 20:13:47.663409 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp" event={"ID":"01df7c9d-768d-417f-a7ed-7865655d889d","Type":"ContainerStarted","Data":"16ce54c44632ed9bbfcc9a0437b998293492796148f680099e73c7e9e0a2cd22"} Dec 02 20:13:47 crc kubenswrapper[4807]: I1202 20:13:47.704525 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6vnbp" podStartSLOduration=2.050616988 podStartE2EDuration="8.704501809s" podCreationTimestamp="2025-12-02 20:13:39 +0000 UTC" firstStartedPulling="2025-12-02 20:13:40.590512127 +0000 UTC m=+955.891419622" lastFinishedPulling="2025-12-02 20:13:47.244396918 +0000 UTC m=+962.545304443" observedRunningTime="2025-12-02 20:13:47.700514046 +0000 UTC m=+963.001421541" 
watchObservedRunningTime="2025-12-02 20:13:47.704501809 +0000 UTC m=+963.005409304" Dec 02 20:13:50 crc kubenswrapper[4807]: I1202 20:13:50.614096 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-wd9cv" Dec 02 20:13:50 crc kubenswrapper[4807]: I1202 20:13:50.647923 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:50 crc kubenswrapper[4807]: I1202 20:13:50.647983 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:50 crc kubenswrapper[4807]: I1202 20:13:50.656158 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:50 crc kubenswrapper[4807]: I1202 20:13:50.687865 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-859bbdd485-5q7m9" Dec 02 20:13:50 crc kubenswrapper[4807]: I1202 20:13:50.750248 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bjmsc"] Dec 02 20:13:58 crc kubenswrapper[4807]: I1202 20:13:58.293088 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:13:58 crc kubenswrapper[4807]: I1202 20:13:58.293747 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:14:00 crc kubenswrapper[4807]: I1202 20:14:00.880194 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k8qmj" Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.790665 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g"] Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.793485 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.798529 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.814923 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g"] Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.894362 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l75pv\" (UniqueName: \"kubernetes.io/projected/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-kube-api-access-l75pv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.894432 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.895113 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.995876 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.995948 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.996023 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l75pv\" (UniqueName: \"kubernetes.io/projected/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-kube-api-access-l75pv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.996841 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:14 crc kubenswrapper[4807]: I1202 20:14:14.996861 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:15 crc kubenswrapper[4807]: I1202 20:14:15.025214 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l75pv\" (UniqueName: \"kubernetes.io/projected/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-kube-api-access-l75pv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:15 crc kubenswrapper[4807]: I1202 20:14:15.110754 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:15 crc kubenswrapper[4807]: I1202 20:14:15.356422 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g"] Dec 02 20:14:15 crc kubenswrapper[4807]: I1202 20:14:15.803331 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-bjmsc" podUID="d31ed2df-d4fa-4b71-a218-20d453f1d8cb" containerName="console" containerID="cri-o://b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e" gracePeriod=15 Dec 02 20:14:15 crc kubenswrapper[4807]: I1202 20:14:15.867798 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" event={"ID":"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829","Type":"ContainerStarted","Data":"72c4861de8757e80f183a4d9b149abf39b0c783c15a97b3a7b1357c0fd545067"} Dec 02 20:14:15 crc kubenswrapper[4807]: I1202 20:14:15.867855 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" event={"ID":"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829","Type":"ContainerStarted","Data":"4e405be9f3839b6c0c73e725635b5fe3160835c5772024bf0d31b6f23477e86b"} Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.162371 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bjmsc_d31ed2df-d4fa-4b71-a218-20d453f1d8cb/console/0.log" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.162457 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.315272 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smj9s\" (UniqueName: \"kubernetes.io/projected/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-kube-api-access-smj9s\") pod \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.315327 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-service-ca\") pod \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.315387 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-oauth-serving-cert\") pod \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.315445 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-oauth-config\") pod \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.315496 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-config\") pod \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.315519 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-serving-cert\") pod \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.315546 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-trusted-ca-bundle\") pod \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\" (UID: \"d31ed2df-d4fa-4b71-a218-20d453f1d8cb\") " Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.316372 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-config" (OuterVolumeSpecName: "console-config") pod "d31ed2df-d4fa-4b71-a218-20d453f1d8cb" (UID: "d31ed2df-d4fa-4b71-a218-20d453f1d8cb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.316539 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d31ed2df-d4fa-4b71-a218-20d453f1d8cb" (UID: "d31ed2df-d4fa-4b71-a218-20d453f1d8cb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.316668 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d31ed2df-d4fa-4b71-a218-20d453f1d8cb" (UID: "d31ed2df-d4fa-4b71-a218-20d453f1d8cb"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.316762 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-service-ca" (OuterVolumeSpecName: "service-ca") pod "d31ed2df-d4fa-4b71-a218-20d453f1d8cb" (UID: "d31ed2df-d4fa-4b71-a218-20d453f1d8cb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.321337 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d31ed2df-d4fa-4b71-a218-20d453f1d8cb" (UID: "d31ed2df-d4fa-4b71-a218-20d453f1d8cb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.321594 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d31ed2df-d4fa-4b71-a218-20d453f1d8cb" (UID: "d31ed2df-d4fa-4b71-a218-20d453f1d8cb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.322005 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-kube-api-access-smj9s" (OuterVolumeSpecName: "kube-api-access-smj9s") pod "d31ed2df-d4fa-4b71-a218-20d453f1d8cb" (UID: "d31ed2df-d4fa-4b71-a218-20d453f1d8cb"). InnerVolumeSpecName "kube-api-access-smj9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.416943 4807 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.416975 4807 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.416985 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.416993 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smj9s\" (UniqueName: \"kubernetes.io/projected/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-kube-api-access-smj9s\") on node \"crc\" DevicePath \"\"" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.417003 4807 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.417011 4807 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.417019 4807 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d31ed2df-d4fa-4b71-a218-20d453f1d8cb-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:14:16 crc 
kubenswrapper[4807]: I1202 20:14:16.877912 4807 generic.go:334] "Generic (PLEG): container finished" podID="9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" containerID="72c4861de8757e80f183a4d9b149abf39b0c783c15a97b3a7b1357c0fd545067" exitCode=0 Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.877999 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" event={"ID":"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829","Type":"ContainerDied","Data":"72c4861de8757e80f183a4d9b149abf39b0c783c15a97b3a7b1357c0fd545067"} Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.880701 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-bjmsc_d31ed2df-d4fa-4b71-a218-20d453f1d8cb/console/0.log" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.880787 4807 generic.go:334] "Generic (PLEG): container finished" podID="d31ed2df-d4fa-4b71-a218-20d453f1d8cb" containerID="b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e" exitCode=2 Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.880828 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bjmsc" event={"ID":"d31ed2df-d4fa-4b71-a218-20d453f1d8cb","Type":"ContainerDied","Data":"b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e"} Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.880873 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-bjmsc" event={"ID":"d31ed2df-d4fa-4b71-a218-20d453f1d8cb","Type":"ContainerDied","Data":"672c8f69b1e38276e29a838b77aff72f952ebeb40ba4d0dfaa187b3dfc318d38"} Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.880903 4807 scope.go:117] "RemoveContainer" containerID="b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.880913 4807 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-bjmsc" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.904760 4807 scope.go:117] "RemoveContainer" containerID="b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e" Dec 02 20:14:16 crc kubenswrapper[4807]: E1202 20:14:16.916515 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e\": container with ID starting with b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e not found: ID does not exist" containerID="b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.916620 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e"} err="failed to get container status \"b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e\": rpc error: code = NotFound desc = could not find container \"b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e\": container with ID starting with b6b6454cfca2064541c8070f2153b92d971bf00df9198255955fc29f2945da5e not found: ID does not exist" Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.941951 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-bjmsc"] Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.946980 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-bjmsc"] Dec 02 20:14:16 crc kubenswrapper[4807]: I1202 20:14:16.981870 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d31ed2df-d4fa-4b71-a218-20d453f1d8cb" path="/var/lib/kubelet/pods/d31ed2df-d4fa-4b71-a218-20d453f1d8cb/volumes" Dec 02 20:14:18 crc kubenswrapper[4807]: I1202 20:14:18.910211 4807 generic.go:334] "Generic 
(PLEG): container finished" podID="9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" containerID="060d90cdfb92fb765498101fefa000dc0d2ec070e4178520dadd5843758affe7" exitCode=0 Dec 02 20:14:18 crc kubenswrapper[4807]: I1202 20:14:18.910293 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" event={"ID":"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829","Type":"ContainerDied","Data":"060d90cdfb92fb765498101fefa000dc0d2ec070e4178520dadd5843758affe7"} Dec 02 20:14:19 crc kubenswrapper[4807]: I1202 20:14:19.917699 4807 generic.go:334] "Generic (PLEG): container finished" podID="9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" containerID="169c460502c7a926064db2ee189056013b85acb9f7d4f9a7b99fba9092dbc935" exitCode=0 Dec 02 20:14:19 crc kubenswrapper[4807]: I1202 20:14:19.917787 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" event={"ID":"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829","Type":"ContainerDied","Data":"169c460502c7a926064db2ee189056013b85acb9f7d4f9a7b99fba9092dbc935"} Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.163974 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.285481 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-bundle\") pod \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.285864 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l75pv\" (UniqueName: \"kubernetes.io/projected/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-kube-api-access-l75pv\") pod \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.285950 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-util\") pod \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\" (UID: \"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829\") " Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.286937 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-bundle" (OuterVolumeSpecName: "bundle") pod "9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" (UID: "9a9aee04-5b5a-4c8b-a0e8-b16ed522d829"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.292631 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-kube-api-access-l75pv" (OuterVolumeSpecName: "kube-api-access-l75pv") pod "9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" (UID: "9a9aee04-5b5a-4c8b-a0e8-b16ed522d829"). InnerVolumeSpecName "kube-api-access-l75pv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.387230 4807 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.387268 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l75pv\" (UniqueName: \"kubernetes.io/projected/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-kube-api-access-l75pv\") on node \"crc\" DevicePath \"\"" Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.461136 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-util" (OuterVolumeSpecName: "util") pod "9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" (UID: "9a9aee04-5b5a-4c8b-a0e8-b16ed522d829"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.488845 4807 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9a9aee04-5b5a-4c8b-a0e8-b16ed522d829-util\") on node \"crc\" DevicePath \"\"" Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.935366 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" event={"ID":"9a9aee04-5b5a-4c8b-a0e8-b16ed522d829","Type":"ContainerDied","Data":"4e405be9f3839b6c0c73e725635b5fe3160835c5772024bf0d31b6f23477e86b"} Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.935786 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e405be9f3839b6c0c73e725635b5fe3160835c5772024bf0d31b6f23477e86b" Dec 02 20:14:21 crc kubenswrapper[4807]: I1202 20:14:21.935531 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g" Dec 02 20:14:28 crc kubenswrapper[4807]: I1202 20:14:28.292865 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:14:28 crc kubenswrapper[4807]: I1202 20:14:28.293151 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:14:28 crc kubenswrapper[4807]: I1202 20:14:28.293214 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:14:28 crc kubenswrapper[4807]: I1202 20:14:28.293815 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e7e961064ac2bf5a3d6cced9f51594218b6b12f9bc4f97979713bb99fd8aad69"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:14:28 crc kubenswrapper[4807]: I1202 20:14:28.293875 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://e7e961064ac2bf5a3d6cced9f51594218b6b12f9bc4f97979713bb99fd8aad69" gracePeriod=600 Dec 02 20:14:28 crc kubenswrapper[4807]: I1202 20:14:28.982010 4807 generic.go:334] "Generic (PLEG): 
container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="e7e961064ac2bf5a3d6cced9f51594218b6b12f9bc4f97979713bb99fd8aad69" exitCode=0 Dec 02 20:14:28 crc kubenswrapper[4807]: I1202 20:14:28.982534 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"e7e961064ac2bf5a3d6cced9f51594218b6b12f9bc4f97979713bb99fd8aad69"} Dec 02 20:14:28 crc kubenswrapper[4807]: I1202 20:14:28.982603 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"38f5416dfc921b3e8d35befa1ba790fe16c356edf6ecbb6687773608446c2497"} Dec 02 20:14:28 crc kubenswrapper[4807]: I1202 20:14:28.982632 4807 scope.go:117] "RemoveContainer" containerID="3f19a7fc149c326bcfbb3a1b01b23ae680a3166497ae907c41fb2e7aadf72abf" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.077091 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7"] Dec 02 20:14:31 crc kubenswrapper[4807]: E1202 20:14:31.078236 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31ed2df-d4fa-4b71-a218-20d453f1d8cb" containerName="console" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.078250 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31ed2df-d4fa-4b71-a218-20d453f1d8cb" containerName="console" Dec 02 20:14:31 crc kubenswrapper[4807]: E1202 20:14:31.078260 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" containerName="extract" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.078266 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" containerName="extract" Dec 02 20:14:31 crc kubenswrapper[4807]: 
E1202 20:14:31.078279 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" containerName="pull" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.078288 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" containerName="pull" Dec 02 20:14:31 crc kubenswrapper[4807]: E1202 20:14:31.078304 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" containerName="util" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.078311 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" containerName="util" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.078441 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a9aee04-5b5a-4c8b-a0e8-b16ed522d829" containerName="extract" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.078457 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d31ed2df-d4fa-4b71-a218-20d453f1d8cb" containerName="console" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.079039 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.085611 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.087955 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.088210 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-grm6h" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.088332 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.089608 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.098245 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7"] Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.141261 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7267c32-ac50-4ee6-8766-f9e586c3bf39-apiservice-cert\") pod \"metallb-operator-controller-manager-7684466ddc-qvkp7\" (UID: \"a7267c32-ac50-4ee6-8766-f9e586c3bf39\") " pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.141339 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7267c32-ac50-4ee6-8766-f9e586c3bf39-webhook-cert\") pod \"metallb-operator-controller-manager-7684466ddc-qvkp7\" (UID: 
\"a7267c32-ac50-4ee6-8766-f9e586c3bf39\") " pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.141574 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z5fq\" (UniqueName: \"kubernetes.io/projected/a7267c32-ac50-4ee6-8766-f9e586c3bf39-kube-api-access-7z5fq\") pod \"metallb-operator-controller-manager-7684466ddc-qvkp7\" (UID: \"a7267c32-ac50-4ee6-8766-f9e586c3bf39\") " pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.242491 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z5fq\" (UniqueName: \"kubernetes.io/projected/a7267c32-ac50-4ee6-8766-f9e586c3bf39-kube-api-access-7z5fq\") pod \"metallb-operator-controller-manager-7684466ddc-qvkp7\" (UID: \"a7267c32-ac50-4ee6-8766-f9e586c3bf39\") " pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.242563 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7267c32-ac50-4ee6-8766-f9e586c3bf39-apiservice-cert\") pod \"metallb-operator-controller-manager-7684466ddc-qvkp7\" (UID: \"a7267c32-ac50-4ee6-8766-f9e586c3bf39\") " pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.242603 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7267c32-ac50-4ee6-8766-f9e586c3bf39-webhook-cert\") pod \"metallb-operator-controller-manager-7684466ddc-qvkp7\" (UID: \"a7267c32-ac50-4ee6-8766-f9e586c3bf39\") " pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.249451 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7267c32-ac50-4ee6-8766-f9e586c3bf39-apiservice-cert\") pod \"metallb-operator-controller-manager-7684466ddc-qvkp7\" (UID: \"a7267c32-ac50-4ee6-8766-f9e586c3bf39\") " pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.262589 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7267c32-ac50-4ee6-8766-f9e586c3bf39-webhook-cert\") pod \"metallb-operator-controller-manager-7684466ddc-qvkp7\" (UID: \"a7267c32-ac50-4ee6-8766-f9e586c3bf39\") " pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.269066 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z5fq\" (UniqueName: \"kubernetes.io/projected/a7267c32-ac50-4ee6-8766-f9e586c3bf39-kube-api-access-7z5fq\") pod \"metallb-operator-controller-manager-7684466ddc-qvkp7\" (UID: \"a7267c32-ac50-4ee6-8766-f9e586c3bf39\") " pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.396563 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.587498 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv"] Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.588967 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.593209 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6pwkg" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.593525 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.593819 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.600572 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv"] Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.649355 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrmpk\" (UniqueName: \"kubernetes.io/projected/40eff5fb-df4c-47e1-bddf-ec09d648f511-kube-api-access-mrmpk\") pod \"metallb-operator-webhook-server-9c8db4b6b-74mvv\" (UID: \"40eff5fb-df4c-47e1-bddf-ec09d648f511\") " pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.649768 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40eff5fb-df4c-47e1-bddf-ec09d648f511-webhook-cert\") pod \"metallb-operator-webhook-server-9c8db4b6b-74mvv\" (UID: \"40eff5fb-df4c-47e1-bddf-ec09d648f511\") " pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.649865 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/40eff5fb-df4c-47e1-bddf-ec09d648f511-apiservice-cert\") pod \"metallb-operator-webhook-server-9c8db4b6b-74mvv\" (UID: \"40eff5fb-df4c-47e1-bddf-ec09d648f511\") " pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.729495 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7"] Dec 02 20:14:31 crc kubenswrapper[4807]: W1202 20:14:31.737369 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7267c32_ac50_4ee6_8766_f9e586c3bf39.slice/crio-3479bc79c96411c141b85a14702cca676769992659981991184f1dfe229dc8a9 WatchSource:0}: Error finding container 3479bc79c96411c141b85a14702cca676769992659981991184f1dfe229dc8a9: Status 404 returned error can't find the container with id 3479bc79c96411c141b85a14702cca676769992659981991184f1dfe229dc8a9 Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.750987 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrmpk\" (UniqueName: \"kubernetes.io/projected/40eff5fb-df4c-47e1-bddf-ec09d648f511-kube-api-access-mrmpk\") pod \"metallb-operator-webhook-server-9c8db4b6b-74mvv\" (UID: \"40eff5fb-df4c-47e1-bddf-ec09d648f511\") " pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.751050 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40eff5fb-df4c-47e1-bddf-ec09d648f511-webhook-cert\") pod \"metallb-operator-webhook-server-9c8db4b6b-74mvv\" (UID: \"40eff5fb-df4c-47e1-bddf-ec09d648f511\") " pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.751070 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/40eff5fb-df4c-47e1-bddf-ec09d648f511-apiservice-cert\") pod \"metallb-operator-webhook-server-9c8db4b6b-74mvv\" (UID: \"40eff5fb-df4c-47e1-bddf-ec09d648f511\") " pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.755536 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/40eff5fb-df4c-47e1-bddf-ec09d648f511-apiservice-cert\") pod \"metallb-operator-webhook-server-9c8db4b6b-74mvv\" (UID: \"40eff5fb-df4c-47e1-bddf-ec09d648f511\") " pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.755679 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/40eff5fb-df4c-47e1-bddf-ec09d648f511-webhook-cert\") pod \"metallb-operator-webhook-server-9c8db4b6b-74mvv\" (UID: \"40eff5fb-df4c-47e1-bddf-ec09d648f511\") " pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.773195 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrmpk\" (UniqueName: \"kubernetes.io/projected/40eff5fb-df4c-47e1-bddf-ec09d648f511-kube-api-access-mrmpk\") pod \"metallb-operator-webhook-server-9c8db4b6b-74mvv\" (UID: \"40eff5fb-df4c-47e1-bddf-ec09d648f511\") " pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:31 crc kubenswrapper[4807]: I1202 20:14:31.930134 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:32 crc kubenswrapper[4807]: I1202 20:14:32.013081 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" event={"ID":"a7267c32-ac50-4ee6-8766-f9e586c3bf39","Type":"ContainerStarted","Data":"3479bc79c96411c141b85a14702cca676769992659981991184f1dfe229dc8a9"} Dec 02 20:14:32 crc kubenswrapper[4807]: I1202 20:14:32.186504 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv"] Dec 02 20:14:32 crc kubenswrapper[4807]: W1202 20:14:32.195164 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40eff5fb_df4c_47e1_bddf_ec09d648f511.slice/crio-9cb7c336afd6b87f025b07f9aa0cfdf93079548f8fb4f126def9a0950d64ea00 WatchSource:0}: Error finding container 9cb7c336afd6b87f025b07f9aa0cfdf93079548f8fb4f126def9a0950d64ea00: Status 404 returned error can't find the container with id 9cb7c336afd6b87f025b07f9aa0cfdf93079548f8fb4f126def9a0950d64ea00 Dec 02 20:14:33 crc kubenswrapper[4807]: I1202 20:14:33.041962 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" event={"ID":"40eff5fb-df4c-47e1-bddf-ec09d648f511","Type":"ContainerStarted","Data":"9cb7c336afd6b87f025b07f9aa0cfdf93079548f8fb4f126def9a0950d64ea00"} Dec 02 20:14:35 crc kubenswrapper[4807]: I1202 20:14:35.088137 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" event={"ID":"a7267c32-ac50-4ee6-8766-f9e586c3bf39","Type":"ContainerStarted","Data":"8ff7599ad42ef9a0d0f3ecc91006ce6a4673fd12a217adc72cfebde0629e713b"} Dec 02 20:14:35 crc kubenswrapper[4807]: I1202 20:14:35.088475 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:14:35 crc kubenswrapper[4807]: I1202 20:14:35.111330 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" podStartSLOduration=0.992378901 podStartE2EDuration="4.111305985s" podCreationTimestamp="2025-12-02 20:14:31 +0000 UTC" firstStartedPulling="2025-12-02 20:14:31.741612116 +0000 UTC m=+1007.042519611" lastFinishedPulling="2025-12-02 20:14:34.8605392 +0000 UTC m=+1010.161446695" observedRunningTime="2025-12-02 20:14:35.106212106 +0000 UTC m=+1010.407119601" watchObservedRunningTime="2025-12-02 20:14:35.111305985 +0000 UTC m=+1010.412213480" Dec 02 20:14:38 crc kubenswrapper[4807]: I1202 20:14:38.114240 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" event={"ID":"40eff5fb-df4c-47e1-bddf-ec09d648f511","Type":"ContainerStarted","Data":"95a04b7e5f6a580039da323ff797ba579040423d7bf1787cb94e09ce3740ea98"} Dec 02 20:14:38 crc kubenswrapper[4807]: I1202 20:14:38.114887 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:14:38 crc kubenswrapper[4807]: I1202 20:14:38.142185 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" podStartSLOduration=1.9512299789999998 podStartE2EDuration="7.142166813s" podCreationTimestamp="2025-12-02 20:14:31 +0000 UTC" firstStartedPulling="2025-12-02 20:14:32.201033965 +0000 UTC m=+1007.501941460" lastFinishedPulling="2025-12-02 20:14:37.391970799 +0000 UTC m=+1012.692878294" observedRunningTime="2025-12-02 20:14:38.137368103 +0000 UTC m=+1013.438275598" watchObservedRunningTime="2025-12-02 20:14:38.142166813 +0000 UTC m=+1013.443074308" Dec 02 20:14:51 crc kubenswrapper[4807]: I1202 20:14:51.938516 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-9c8db4b6b-74mvv" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.151514 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl"] Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.152784 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.154980 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.155048 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.161480 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl"] Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.179681 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-config-volume\") pod \"collect-profiles-29411775-nngpl\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.179785 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxznr\" (UniqueName: \"kubernetes.io/projected/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-kube-api-access-rxznr\") pod \"collect-profiles-29411775-nngpl\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.179904 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-secret-volume\") pod \"collect-profiles-29411775-nngpl\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.281340 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-config-volume\") pod \"collect-profiles-29411775-nngpl\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.281392 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxznr\" (UniqueName: \"kubernetes.io/projected/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-kube-api-access-rxznr\") pod \"collect-profiles-29411775-nngpl\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.281420 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-secret-volume\") pod \"collect-profiles-29411775-nngpl\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.283169 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-config-volume\") pod \"collect-profiles-29411775-nngpl\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.287618 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-secret-volume\") pod \"collect-profiles-29411775-nngpl\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.310994 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxznr\" (UniqueName: \"kubernetes.io/projected/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-kube-api-access-rxznr\") pod \"collect-profiles-29411775-nngpl\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.477026 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:00 crc kubenswrapper[4807]: I1202 20:15:00.669865 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl"] Dec 02 20:15:01 crc kubenswrapper[4807]: I1202 20:15:01.273883 4807 generic.go:334] "Generic (PLEG): container finished" podID="82089ba4-d2c7-49e1-96e0-bf2d1a082aa0" containerID="97304526de2b565171d77469b97822d3affa6c08115f8ec5697384099d48b94b" exitCode=0 Dec 02 20:15:01 crc kubenswrapper[4807]: I1202 20:15:01.273930 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" event={"ID":"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0","Type":"ContainerDied","Data":"97304526de2b565171d77469b97822d3affa6c08115f8ec5697384099d48b94b"} Dec 02 20:15:01 crc kubenswrapper[4807]: I1202 20:15:01.273956 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" event={"ID":"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0","Type":"ContainerStarted","Data":"5de2f0fbbbae5bffae1a51e927e7ccebfbc936d6cd386542d72815515454faab"} Dec 02 20:15:02 crc kubenswrapper[4807]: I1202 20:15:02.612322 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:02 crc kubenswrapper[4807]: I1202 20:15:02.722507 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-secret-volume\") pod \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " Dec 02 20:15:02 crc kubenswrapper[4807]: I1202 20:15:02.722671 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-config-volume\") pod \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " Dec 02 20:15:02 crc kubenswrapper[4807]: I1202 20:15:02.722803 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxznr\" (UniqueName: \"kubernetes.io/projected/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-kube-api-access-rxznr\") pod \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\" (UID: \"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0\") " Dec 02 20:15:02 crc kubenswrapper[4807]: I1202 20:15:02.723885 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-config-volume" (OuterVolumeSpecName: "config-volume") pod "82089ba4-d2c7-49e1-96e0-bf2d1a082aa0" (UID: "82089ba4-d2c7-49e1-96e0-bf2d1a082aa0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:15:02 crc kubenswrapper[4807]: I1202 20:15:02.731042 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-kube-api-access-rxznr" (OuterVolumeSpecName: "kube-api-access-rxznr") pod "82089ba4-d2c7-49e1-96e0-bf2d1a082aa0" (UID: "82089ba4-d2c7-49e1-96e0-bf2d1a082aa0"). 
InnerVolumeSpecName "kube-api-access-rxznr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:15:02 crc kubenswrapper[4807]: I1202 20:15:02.731979 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "82089ba4-d2c7-49e1-96e0-bf2d1a082aa0" (UID: "82089ba4-d2c7-49e1-96e0-bf2d1a082aa0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:15:02 crc kubenswrapper[4807]: I1202 20:15:02.824459 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 20:15:02 crc kubenswrapper[4807]: I1202 20:15:02.824510 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 20:15:02 crc kubenswrapper[4807]: I1202 20:15:02.824523 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxznr\" (UniqueName: \"kubernetes.io/projected/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0-kube-api-access-rxznr\") on node \"crc\" DevicePath \"\"" Dec 02 20:15:03 crc kubenswrapper[4807]: I1202 20:15:03.292919 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" event={"ID":"82089ba4-d2c7-49e1-96e0-bf2d1a082aa0","Type":"ContainerDied","Data":"5de2f0fbbbae5bffae1a51e927e7ccebfbc936d6cd386542d72815515454faab"} Dec 02 20:15:03 crc kubenswrapper[4807]: I1202 20:15:03.293333 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5de2f0fbbbae5bffae1a51e927e7ccebfbc936d6cd386542d72815515454faab" Dec 02 20:15:03 crc kubenswrapper[4807]: I1202 20:15:03.293095 4807 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl" Dec 02 20:15:11 crc kubenswrapper[4807]: I1202 20:15:11.401093 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7684466ddc-qvkp7" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.249010 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-d5xdx"] Dec 02 20:15:12 crc kubenswrapper[4807]: E1202 20:15:12.249339 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82089ba4-d2c7-49e1-96e0-bf2d1a082aa0" containerName="collect-profiles" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.249361 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="82089ba4-d2c7-49e1-96e0-bf2d1a082aa0" containerName="collect-profiles" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.249520 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="82089ba4-d2c7-49e1-96e0-bf2d1a082aa0" containerName="collect-profiles" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.251921 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.256006 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.256023 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-krvnd" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.256437 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.257758 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc"] Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.258702 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.260617 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.277308 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc"] Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.353432 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vgclf"] Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.356785 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.360566 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.361206 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.361331 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.361447 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-t9cwm" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.367468 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-frr-conf\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.367514 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gngm2\" (UniqueName: \"kubernetes.io/projected/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-kube-api-access-gngm2\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.367534 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-4hldc\" (UID: \"13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.367563 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqgjb\" (UniqueName: \"kubernetes.io/projected/13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f-kube-api-access-qqgjb\") pod \"frr-k8s-webhook-server-7fcb986d4-4hldc\" (UID: \"13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.367587 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-metrics-certs\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.367617 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-reloader\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.367633 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-metrics\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.367673 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-frr-startup\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.367698 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-frr-sockets\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.369944 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-qwrnk"] Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.371796 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.375199 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.395871 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-qwrnk"] Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469325 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/10f57a6c-ce50-4026-a330-b0a195528a92-metallb-excludel2\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469376 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-reloader\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469401 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-metrics\") pod \"frr-k8s-d5xdx\" (UID: 
\"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469423 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb6pw\" (UniqueName: \"kubernetes.io/projected/af18f5b9-8057-49c0-b0b0-d64a7fff5357-kube-api-access-nb6pw\") pod \"controller-f8648f98b-qwrnk\" (UID: \"af18f5b9-8057-49c0-b0b0-d64a7fff5357\") " pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469444 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-metrics-certs\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469458 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67gn9\" (UniqueName: \"kubernetes.io/projected/10f57a6c-ce50-4026-a330-b0a195528a92-kube-api-access-67gn9\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469481 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af18f5b9-8057-49c0-b0b0-d64a7fff5357-metrics-certs\") pod \"controller-f8648f98b-qwrnk\" (UID: \"af18f5b9-8057-49c0-b0b0-d64a7fff5357\") " pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469506 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-memberlist\") pod \"speaker-vgclf\" (UID: 
\"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469531 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-frr-startup\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469553 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-frr-sockets\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469572 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-frr-conf\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469587 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gngm2\" (UniqueName: \"kubernetes.io/projected/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-kube-api-access-gngm2\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469603 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-4hldc\" (UID: \"13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469625 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af18f5b9-8057-49c0-b0b0-d64a7fff5357-cert\") pod \"controller-f8648f98b-qwrnk\" (UID: \"af18f5b9-8057-49c0-b0b0-d64a7fff5357\") " pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469645 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqgjb\" (UniqueName: \"kubernetes.io/projected/13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f-kube-api-access-qqgjb\") pod \"frr-k8s-webhook-server-7fcb986d4-4hldc\" (UID: \"13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.469670 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-metrics-certs\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.470444 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-reloader\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.470665 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-metrics\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.471000 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-frr-sockets\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.471136 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-frr-conf\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.471482 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-frr-startup\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.476434 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-metrics-certs\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.485142 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-4hldc\" (UID: \"13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.487656 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gngm2\" (UniqueName: \"kubernetes.io/projected/ed377bf7-d0c1-45d0-bad2-948f4bde39aa-kube-api-access-gngm2\") pod \"frr-k8s-d5xdx\" (UID: \"ed377bf7-d0c1-45d0-bad2-948f4bde39aa\") " pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 
crc kubenswrapper[4807]: I1202 20:15:12.491115 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqgjb\" (UniqueName: \"kubernetes.io/projected/13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f-kube-api-access-qqgjb\") pod \"frr-k8s-webhook-server-7fcb986d4-4hldc\" (UID: \"13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.571393 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af18f5b9-8057-49c0-b0b0-d64a7fff5357-cert\") pod \"controller-f8648f98b-qwrnk\" (UID: \"af18f5b9-8057-49c0-b0b0-d64a7fff5357\") " pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.571770 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/10f57a6c-ce50-4026-a330-b0a195528a92-metallb-excludel2\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.572821 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb6pw\" (UniqueName: \"kubernetes.io/projected/af18f5b9-8057-49c0-b0b0-d64a7fff5357-kube-api-access-nb6pw\") pod \"controller-f8648f98b-qwrnk\" (UID: \"af18f5b9-8057-49c0-b0b0-d64a7fff5357\") " pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.573226 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.572761 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/10f57a6c-ce50-4026-a330-b0a195528a92-metallb-excludel2\") pod \"speaker-vgclf\" (UID: 
\"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.573241 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-metrics-certs\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.573600 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67gn9\" (UniqueName: \"kubernetes.io/projected/10f57a6c-ce50-4026-a330-b0a195528a92-kube-api-access-67gn9\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.573707 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af18f5b9-8057-49c0-b0b0-d64a7fff5357-metrics-certs\") pod \"controller-f8648f98b-qwrnk\" (UID: \"af18f5b9-8057-49c0-b0b0-d64a7fff5357\") " pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.573844 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-memberlist\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: E1202 20:15:12.574008 4807 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 20:15:12 crc kubenswrapper[4807]: E1202 20:15:12.574106 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-memberlist podName:10f57a6c-ce50-4026-a330-b0a195528a92 nodeName:}" failed. 
No retries permitted until 2025-12-02 20:15:13.074082069 +0000 UTC m=+1048.374989564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-memberlist") pod "speaker-vgclf" (UID: "10f57a6c-ce50-4026-a330-b0a195528a92") : secret "metallb-memberlist" not found Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.577592 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-metrics-certs\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.579699 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af18f5b9-8057-49c0-b0b0-d64a7fff5357-metrics-certs\") pod \"controller-f8648f98b-qwrnk\" (UID: \"af18f5b9-8057-49c0-b0b0-d64a7fff5357\") " pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.584113 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.591638 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb6pw\" (UniqueName: \"kubernetes.io/projected/af18f5b9-8057-49c0-b0b0-d64a7fff5357-kube-api-access-nb6pw\") pod \"controller-f8648f98b-qwrnk\" (UID: \"af18f5b9-8057-49c0-b0b0-d64a7fff5357\") " pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.592597 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.593594 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af18f5b9-8057-49c0-b0b0-d64a7fff5357-cert\") pod \"controller-f8648f98b-qwrnk\" (UID: \"af18f5b9-8057-49c0-b0b0-d64a7fff5357\") " pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.594922 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67gn9\" (UniqueName: \"kubernetes.io/projected/10f57a6c-ce50-4026-a330-b0a195528a92-kube-api-access-67gn9\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.689515 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:12 crc kubenswrapper[4807]: I1202 20:15:12.921110 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc"] Dec 02 20:15:13 crc kubenswrapper[4807]: I1202 20:15:13.094019 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-memberlist\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:13 crc kubenswrapper[4807]: E1202 20:15:13.094171 4807 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 20:15:13 crc kubenswrapper[4807]: E1202 20:15:13.094349 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-memberlist podName:10f57a6c-ce50-4026-a330-b0a195528a92 nodeName:}" failed. 
No retries permitted until 2025-12-02 20:15:14.094326887 +0000 UTC m=+1049.395234382 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-memberlist") pod "speaker-vgclf" (UID: "10f57a6c-ce50-4026-a330-b0a195528a92") : secret "metallb-memberlist" not found Dec 02 20:15:13 crc kubenswrapper[4807]: I1202 20:15:13.252965 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-qwrnk"] Dec 02 20:15:13 crc kubenswrapper[4807]: W1202 20:15:13.256951 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf18f5b9_8057_49c0_b0b0_d64a7fff5357.slice/crio-78d23508e478b3837499cb1298f414825e4afde7b5d4b429e63dc86d3b550b2b WatchSource:0}: Error finding container 78d23508e478b3837499cb1298f414825e4afde7b5d4b429e63dc86d3b550b2b: Status 404 returned error can't find the container with id 78d23508e478b3837499cb1298f414825e4afde7b5d4b429e63dc86d3b550b2b Dec 02 20:15:13 crc kubenswrapper[4807]: I1202 20:15:13.365619 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-qwrnk" event={"ID":"af18f5b9-8057-49c0-b0b0-d64a7fff5357","Type":"ContainerStarted","Data":"78d23508e478b3837499cb1298f414825e4afde7b5d4b429e63dc86d3b550b2b"} Dec 02 20:15:13 crc kubenswrapper[4807]: I1202 20:15:13.366960 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-d5xdx" event={"ID":"ed377bf7-d0c1-45d0-bad2-948f4bde39aa","Type":"ContainerStarted","Data":"5294e754704b3c895f539a6a90d63936c2d07788f2ab1e4a9b366c696339f8c2"} Dec 02 20:15:13 crc kubenswrapper[4807]: I1202 20:15:13.368154 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" 
event={"ID":"13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f","Type":"ContainerStarted","Data":"a570b1aaa1c39d48fc948cc5e38a5f4ef30dbd1b91279d1f752d85b1d1f8e3ac"} Dec 02 20:15:14 crc kubenswrapper[4807]: I1202 20:15:14.110391 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-memberlist\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:14 crc kubenswrapper[4807]: I1202 20:15:14.116827 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/10f57a6c-ce50-4026-a330-b0a195528a92-memberlist\") pod \"speaker-vgclf\" (UID: \"10f57a6c-ce50-4026-a330-b0a195528a92\") " pod="metallb-system/speaker-vgclf" Dec 02 20:15:14 crc kubenswrapper[4807]: I1202 20:15:14.172675 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vgclf" Dec 02 20:15:14 crc kubenswrapper[4807]: W1202 20:15:14.218086 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10f57a6c_ce50_4026_a330_b0a195528a92.slice/crio-e9d68ae34c22e48319f751b126cc562192db73faa7ef3658fe71f548bca5f493 WatchSource:0}: Error finding container e9d68ae34c22e48319f751b126cc562192db73faa7ef3658fe71f548bca5f493: Status 404 returned error can't find the container with id e9d68ae34c22e48319f751b126cc562192db73faa7ef3658fe71f548bca5f493 Dec 02 20:15:14 crc kubenswrapper[4807]: I1202 20:15:14.417025 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-qwrnk" event={"ID":"af18f5b9-8057-49c0-b0b0-d64a7fff5357","Type":"ContainerStarted","Data":"89416fe1c903cc4c048ca6bc0d7dadf25bbe0d30dfc414960ca6da7d44860a85"} Dec 02 20:15:14 crc kubenswrapper[4807]: I1202 20:15:14.417110 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-f8648f98b-qwrnk" event={"ID":"af18f5b9-8057-49c0-b0b0-d64a7fff5357","Type":"ContainerStarted","Data":"31910ac31e1ee3236ae367684fce3c196313fc0999f7aad77977574652a189a8"} Dec 02 20:15:14 crc kubenswrapper[4807]: I1202 20:15:14.418011 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:14 crc kubenswrapper[4807]: I1202 20:15:14.420857 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vgclf" event={"ID":"10f57a6c-ce50-4026-a330-b0a195528a92","Type":"ContainerStarted","Data":"e9d68ae34c22e48319f751b126cc562192db73faa7ef3658fe71f548bca5f493"} Dec 02 20:15:14 crc kubenswrapper[4807]: I1202 20:15:14.455199 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-qwrnk" podStartSLOduration=2.455169605 podStartE2EDuration="2.455169605s" podCreationTimestamp="2025-12-02 20:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:15:14.446339087 +0000 UTC m=+1049.747246582" watchObservedRunningTime="2025-12-02 20:15:14.455169605 +0000 UTC m=+1049.756077100" Dec 02 20:15:15 crc kubenswrapper[4807]: I1202 20:15:15.433445 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vgclf" event={"ID":"10f57a6c-ce50-4026-a330-b0a195528a92","Type":"ContainerStarted","Data":"3fc3a14cee9773d6b11f352d7337be568599a49fd77aa3e304b8aef91a72ffa2"} Dec 02 20:15:15 crc kubenswrapper[4807]: I1202 20:15:15.433900 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vgclf" Dec 02 20:15:15 crc kubenswrapper[4807]: I1202 20:15:15.433952 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vgclf" 
event={"ID":"10f57a6c-ce50-4026-a330-b0a195528a92","Type":"ContainerStarted","Data":"ab4e44eaa4dcdb874e0adab291169b9d8fd35dcc4f416f7980c06286bec410dc"} Dec 02 20:15:15 crc kubenswrapper[4807]: I1202 20:15:15.465279 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vgclf" podStartSLOduration=3.465256602 podStartE2EDuration="3.465256602s" podCreationTimestamp="2025-12-02 20:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:15:15.459881135 +0000 UTC m=+1050.760788630" watchObservedRunningTime="2025-12-02 20:15:15.465256602 +0000 UTC m=+1050.766164097" Dec 02 20:15:21 crc kubenswrapper[4807]: I1202 20:15:21.478540 4807 generic.go:334] "Generic (PLEG): container finished" podID="ed377bf7-d0c1-45d0-bad2-948f4bde39aa" containerID="f6843db0db454786c3ec677d8f87bf27a3b91bc679c7b3e423c5ac50a3c75fe3" exitCode=0 Dec 02 20:15:21 crc kubenswrapper[4807]: I1202 20:15:21.478651 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-d5xdx" event={"ID":"ed377bf7-d0c1-45d0-bad2-948f4bde39aa","Type":"ContainerDied","Data":"f6843db0db454786c3ec677d8f87bf27a3b91bc679c7b3e423c5ac50a3c75fe3"} Dec 02 20:15:21 crc kubenswrapper[4807]: I1202 20:15:21.481885 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" event={"ID":"13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f","Type":"ContainerStarted","Data":"d25db97ade3fdef12eb603d146aec1d10da6619a7eafd1d990ab798d3dedc407"} Dec 02 20:15:21 crc kubenswrapper[4807]: I1202 20:15:21.481964 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" Dec 02 20:15:21 crc kubenswrapper[4807]: I1202 20:15:21.535542 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" 
podStartSLOduration=1.5996077180000001 podStartE2EDuration="9.535520668s" podCreationTimestamp="2025-12-02 20:15:12 +0000 UTC" firstStartedPulling="2025-12-02 20:15:12.943382922 +0000 UTC m=+1048.244290417" lastFinishedPulling="2025-12-02 20:15:20.879295832 +0000 UTC m=+1056.180203367" observedRunningTime="2025-12-02 20:15:21.534714225 +0000 UTC m=+1056.835621760" watchObservedRunningTime="2025-12-02 20:15:21.535520668 +0000 UTC m=+1056.836428163" Dec 02 20:15:22 crc kubenswrapper[4807]: I1202 20:15:22.490549 4807 generic.go:334] "Generic (PLEG): container finished" podID="ed377bf7-d0c1-45d0-bad2-948f4bde39aa" containerID="9eb712df5634f7c73af400d430dbcc253ac9b70a41bd2a7ac94eaae86e77a934" exitCode=0 Dec 02 20:15:22 crc kubenswrapper[4807]: I1202 20:15:22.490641 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-d5xdx" event={"ID":"ed377bf7-d0c1-45d0-bad2-948f4bde39aa","Type":"ContainerDied","Data":"9eb712df5634f7c73af400d430dbcc253ac9b70a41bd2a7ac94eaae86e77a934"} Dec 02 20:15:23 crc kubenswrapper[4807]: I1202 20:15:23.505473 4807 generic.go:334] "Generic (PLEG): container finished" podID="ed377bf7-d0c1-45d0-bad2-948f4bde39aa" containerID="7afcb91b9c0118aec2710a5db8d4c555ccc720192ebbcbb3a0602fc98bc37588" exitCode=0 Dec 02 20:15:23 crc kubenswrapper[4807]: I1202 20:15:23.505631 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-d5xdx" event={"ID":"ed377bf7-d0c1-45d0-bad2-948f4bde39aa","Type":"ContainerDied","Data":"7afcb91b9c0118aec2710a5db8d4c555ccc720192ebbcbb3a0602fc98bc37588"} Dec 02 20:15:24 crc kubenswrapper[4807]: I1202 20:15:24.178212 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vgclf" Dec 02 20:15:24 crc kubenswrapper[4807]: I1202 20:15:24.518049 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-d5xdx" 
event={"ID":"ed377bf7-d0c1-45d0-bad2-948f4bde39aa","Type":"ContainerStarted","Data":"d421f7da2412cd1c83c7542f939fc5352ab32ac9a005a0f4790605cb12db32b0"} Dec 02 20:15:24 crc kubenswrapper[4807]: I1202 20:15:24.518115 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-d5xdx" event={"ID":"ed377bf7-d0c1-45d0-bad2-948f4bde39aa","Type":"ContainerStarted","Data":"5d27fdf43a11a188d35cfe8e13242af3dd4fd458e312e569d4c44b9cf629a6f1"} Dec 02 20:15:24 crc kubenswrapper[4807]: I1202 20:15:24.518134 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-d5xdx" event={"ID":"ed377bf7-d0c1-45d0-bad2-948f4bde39aa","Type":"ContainerStarted","Data":"6c4099f0c37683fe60692164683f5f5ea4eb3adfcd7142acb6a76de582ab6099"} Dec 02 20:15:24 crc kubenswrapper[4807]: I1202 20:15:24.518150 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-d5xdx" event={"ID":"ed377bf7-d0c1-45d0-bad2-948f4bde39aa","Type":"ContainerStarted","Data":"a53dac8fe06a139925beaff32d686b634f354e7ce3191f607f9eb3b3456c6e62"} Dec 02 20:15:24 crc kubenswrapper[4807]: I1202 20:15:24.518165 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-d5xdx" event={"ID":"ed377bf7-d0c1-45d0-bad2-948f4bde39aa","Type":"ContainerStarted","Data":"e73a66b3a1a2dc5aff75c86f0cf926d18a9893b82cabcc369795a6abc43f57f8"} Dec 02 20:15:25 crc kubenswrapper[4807]: I1202 20:15:25.532647 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-d5xdx" event={"ID":"ed377bf7-d0c1-45d0-bad2-948f4bde39aa","Type":"ContainerStarted","Data":"e61d54655619a6ddd18e1b7676ebab1ea5cade4524f396cb15c1acbf4326d8a5"} Dec 02 20:15:25 crc kubenswrapper[4807]: I1202 20:15:25.532899 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:25 crc kubenswrapper[4807]: I1202 20:15:25.562577 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-d5xdx" podStartSLOduration=5.557254296 podStartE2EDuration="13.562545136s" podCreationTimestamp="2025-12-02 20:15:12 +0000 UTC" firstStartedPulling="2025-12-02 20:15:12.87254825 +0000 UTC m=+1048.173455745" lastFinishedPulling="2025-12-02 20:15:20.87783908 +0000 UTC m=+1056.178746585" observedRunningTime="2025-12-02 20:15:25.555035597 +0000 UTC m=+1060.855943112" watchObservedRunningTime="2025-12-02 20:15:25.562545136 +0000 UTC m=+1060.863452651" Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.318946 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mmwc6"] Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.321410 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mmwc6" Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.345574 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-txw9g" Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.345916 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.354543 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.370067 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mmwc6"] Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.439662 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz67g\" (UniqueName: \"kubernetes.io/projected/267955c6-57a8-49ec-aac1-0106fed3dfc8-kube-api-access-sz67g\") pod \"openstack-operator-index-mmwc6\" (UID: \"267955c6-57a8-49ec-aac1-0106fed3dfc8\") " 
pod="openstack-operators/openstack-operator-index-mmwc6" Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.541036 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz67g\" (UniqueName: \"kubernetes.io/projected/267955c6-57a8-49ec-aac1-0106fed3dfc8-kube-api-access-sz67g\") pod \"openstack-operator-index-mmwc6\" (UID: \"267955c6-57a8-49ec-aac1-0106fed3dfc8\") " pod="openstack-operators/openstack-operator-index-mmwc6" Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.572835 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz67g\" (UniqueName: \"kubernetes.io/projected/267955c6-57a8-49ec-aac1-0106fed3dfc8-kube-api-access-sz67g\") pod \"openstack-operator-index-mmwc6\" (UID: \"267955c6-57a8-49ec-aac1-0106fed3dfc8\") " pod="openstack-operators/openstack-operator-index-mmwc6" Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.585319 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.623831 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:27 crc kubenswrapper[4807]: I1202 20:15:27.680150 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mmwc6" Dec 02 20:15:28 crc kubenswrapper[4807]: I1202 20:15:28.016777 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mmwc6"] Dec 02 20:15:28 crc kubenswrapper[4807]: I1202 20:15:28.556148 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mmwc6" event={"ID":"267955c6-57a8-49ec-aac1-0106fed3dfc8","Type":"ContainerStarted","Data":"c8bbb4c98a0cce0299cff5064d3a58f2b0c82f31b1915b9cf7bc3c5d2391157f"} Dec 02 20:15:30 crc kubenswrapper[4807]: I1202 20:15:30.673905 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mmwc6"] Dec 02 20:15:31 crc kubenswrapper[4807]: I1202 20:15:31.484009 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-h8bl8"] Dec 02 20:15:31 crc kubenswrapper[4807]: I1202 20:15:31.485585 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-h8bl8" Dec 02 20:15:31 crc kubenswrapper[4807]: I1202 20:15:31.490967 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h8bl8"] Dec 02 20:15:31 crc kubenswrapper[4807]: I1202 20:15:31.603413 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njrpn\" (UniqueName: \"kubernetes.io/projected/1f433a73-95ba-41cc-9f6e-3c6b26dd5e50-kube-api-access-njrpn\") pod \"openstack-operator-index-h8bl8\" (UID: \"1f433a73-95ba-41cc-9f6e-3c6b26dd5e50\") " pod="openstack-operators/openstack-operator-index-h8bl8" Dec 02 20:15:31 crc kubenswrapper[4807]: I1202 20:15:31.705465 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njrpn\" (UniqueName: \"kubernetes.io/projected/1f433a73-95ba-41cc-9f6e-3c6b26dd5e50-kube-api-access-njrpn\") pod \"openstack-operator-index-h8bl8\" (UID: \"1f433a73-95ba-41cc-9f6e-3c6b26dd5e50\") " pod="openstack-operators/openstack-operator-index-h8bl8" Dec 02 20:15:31 crc kubenswrapper[4807]: I1202 20:15:31.730208 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njrpn\" (UniqueName: \"kubernetes.io/projected/1f433a73-95ba-41cc-9f6e-3c6b26dd5e50-kube-api-access-njrpn\") pod \"openstack-operator-index-h8bl8\" (UID: \"1f433a73-95ba-41cc-9f6e-3c6b26dd5e50\") " pod="openstack-operators/openstack-operator-index-h8bl8" Dec 02 20:15:31 crc kubenswrapper[4807]: I1202 20:15:31.807042 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-h8bl8" Dec 02 20:15:32 crc kubenswrapper[4807]: I1202 20:15:32.270122 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-h8bl8"] Dec 02 20:15:32 crc kubenswrapper[4807]: W1202 20:15:32.275115 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f433a73_95ba_41cc_9f6e_3c6b26dd5e50.slice/crio-16a45789c6b69c5d304763eca6a1660fc8dbea9902f2221ce8c668703170f807 WatchSource:0}: Error finding container 16a45789c6b69c5d304763eca6a1660fc8dbea9902f2221ce8c668703170f807: Status 404 returned error can't find the container with id 16a45789c6b69c5d304763eca6a1660fc8dbea9902f2221ce8c668703170f807 Dec 02 20:15:32 crc kubenswrapper[4807]: I1202 20:15:32.588410 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h8bl8" event={"ID":"1f433a73-95ba-41cc-9f6e-3c6b26dd5e50","Type":"ContainerStarted","Data":"16a45789c6b69c5d304763eca6a1660fc8dbea9902f2221ce8c668703170f807"} Dec 02 20:15:32 crc kubenswrapper[4807]: I1202 20:15:32.590546 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mmwc6" event={"ID":"267955c6-57a8-49ec-aac1-0106fed3dfc8","Type":"ContainerStarted","Data":"1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776"} Dec 02 20:15:32 crc kubenswrapper[4807]: I1202 20:15:32.590760 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mmwc6" podUID="267955c6-57a8-49ec-aac1-0106fed3dfc8" containerName="registry-server" containerID="cri-o://1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776" gracePeriod=2 Dec 02 20:15:32 crc kubenswrapper[4807]: I1202 20:15:32.603611 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-4hldc" Dec 02 20:15:32 crc kubenswrapper[4807]: I1202 20:15:32.615481 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mmwc6" podStartSLOduration=2.462300601 podStartE2EDuration="5.615448056s" podCreationTimestamp="2025-12-02 20:15:27 +0000 UTC" firstStartedPulling="2025-12-02 20:15:28.028666304 +0000 UTC m=+1063.329573799" lastFinishedPulling="2025-12-02 20:15:31.181813739 +0000 UTC m=+1066.482721254" observedRunningTime="2025-12-02 20:15:32.612339636 +0000 UTC m=+1067.913247161" watchObservedRunningTime="2025-12-02 20:15:32.615448056 +0000 UTC m=+1067.916355581" Dec 02 20:15:32 crc kubenswrapper[4807]: I1202 20:15:32.695430 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-qwrnk" Dec 02 20:15:32 crc kubenswrapper[4807]: I1202 20:15:32.979392 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mmwc6" Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.028557 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz67g\" (UniqueName: \"kubernetes.io/projected/267955c6-57a8-49ec-aac1-0106fed3dfc8-kube-api-access-sz67g\") pod \"267955c6-57a8-49ec-aac1-0106fed3dfc8\" (UID: \"267955c6-57a8-49ec-aac1-0106fed3dfc8\") " Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.036478 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267955c6-57a8-49ec-aac1-0106fed3dfc8-kube-api-access-sz67g" (OuterVolumeSpecName: "kube-api-access-sz67g") pod "267955c6-57a8-49ec-aac1-0106fed3dfc8" (UID: "267955c6-57a8-49ec-aac1-0106fed3dfc8"). InnerVolumeSpecName "kube-api-access-sz67g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.130697 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz67g\" (UniqueName: \"kubernetes.io/projected/267955c6-57a8-49ec-aac1-0106fed3dfc8-kube-api-access-sz67g\") on node \"crc\" DevicePath \"\"" Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.599313 4807 generic.go:334] "Generic (PLEG): container finished" podID="267955c6-57a8-49ec-aac1-0106fed3dfc8" containerID="1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776" exitCode=0 Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.599380 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mmwc6" Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.599444 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mmwc6" event={"ID":"267955c6-57a8-49ec-aac1-0106fed3dfc8","Type":"ContainerDied","Data":"1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776"} Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.599483 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mmwc6" event={"ID":"267955c6-57a8-49ec-aac1-0106fed3dfc8","Type":"ContainerDied","Data":"c8bbb4c98a0cce0299cff5064d3a58f2b0c82f31b1915b9cf7bc3c5d2391157f"} Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.599507 4807 scope.go:117] "RemoveContainer" containerID="1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776" Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.601682 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-h8bl8" event={"ID":"1f433a73-95ba-41cc-9f6e-3c6b26dd5e50","Type":"ContainerStarted","Data":"93cfa08998368c145f88a2d6a13f7da331a66543370eef6741393c01f69a42bc"} Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 
20:15:33.620625 4807 scope.go:117] "RemoveContainer" containerID="1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776" Dec 02 20:15:33 crc kubenswrapper[4807]: E1202 20:15:33.621207 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776\": container with ID starting with 1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776 not found: ID does not exist" containerID="1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776" Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.621251 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776"} err="failed to get container status \"1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776\": rpc error: code = NotFound desc = could not find container \"1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776\": container with ID starting with 1420d7337c053161e0cd6ebe663c002ea88f91d43fa3edb58aec888826401776 not found: ID does not exist" Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.626837 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-h8bl8" podStartSLOduration=2.28323863 podStartE2EDuration="2.626820271s" podCreationTimestamp="2025-12-02 20:15:31 +0000 UTC" firstStartedPulling="2025-12-02 20:15:32.281406535 +0000 UTC m=+1067.582314050" lastFinishedPulling="2025-12-02 20:15:32.624988156 +0000 UTC m=+1067.925895691" observedRunningTime="2025-12-02 20:15:33.623656489 +0000 UTC m=+1068.924564004" watchObservedRunningTime="2025-12-02 20:15:33.626820271 +0000 UTC m=+1068.927727766" Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.643990 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-mmwc6"] Dec 02 20:15:33 crc kubenswrapper[4807]: I1202 20:15:33.649510 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mmwc6"] Dec 02 20:15:34 crc kubenswrapper[4807]: I1202 20:15:34.997046 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267955c6-57a8-49ec-aac1-0106fed3dfc8" path="/var/lib/kubelet/pods/267955c6-57a8-49ec-aac1-0106fed3dfc8/volumes" Dec 02 20:15:41 crc kubenswrapper[4807]: I1202 20:15:41.807922 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-h8bl8" Dec 02 20:15:41 crc kubenswrapper[4807]: I1202 20:15:41.808102 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-h8bl8" Dec 02 20:15:41 crc kubenswrapper[4807]: I1202 20:15:41.852911 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-h8bl8" Dec 02 20:15:42 crc kubenswrapper[4807]: I1202 20:15:42.589844 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-d5xdx" Dec 02 20:15:42 crc kubenswrapper[4807]: I1202 20:15:42.727208 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-h8bl8" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.752575 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd"] Dec 02 20:16:03 crc kubenswrapper[4807]: E1202 20:16:03.753544 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267955c6-57a8-49ec-aac1-0106fed3dfc8" containerName="registry-server" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.753560 4807 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="267955c6-57a8-49ec-aac1-0106fed3dfc8" containerName="registry-server" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.753682 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="267955c6-57a8-49ec-aac1-0106fed3dfc8" containerName="registry-server" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.754571 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.762877 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-brstf" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.764023 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd"] Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.818442 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-bundle\") pod \"9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd\" (UID: \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.818518 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8vw7\" (UniqueName: \"kubernetes.io/projected/94bd7c70-1bb7-4b0e-816e-5be2df3641da-kube-api-access-m8vw7\") pod \"9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd\" (UID: \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.818583 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-util\") pod \"9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd\" (UID: \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.920538 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-util\") pod \"9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd\" (UID: \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.920681 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-bundle\") pod \"9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd\" (UID: \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.920807 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8vw7\" (UniqueName: \"kubernetes.io/projected/94bd7c70-1bb7-4b0e-816e-5be2df3641da-kube-api-access-m8vw7\") pod \"9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd\" (UID: \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.921465 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-util\") pod 
\"9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd\" (UID: \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.921630 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-bundle\") pod \"9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd\" (UID: \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:03 crc kubenswrapper[4807]: I1202 20:16:03.956640 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8vw7\" (UniqueName: \"kubernetes.io/projected/94bd7c70-1bb7-4b0e-816e-5be2df3641da-kube-api-access-m8vw7\") pod \"9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd\" (UID: \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:04 crc kubenswrapper[4807]: I1202 20:16:04.078300 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:04 crc kubenswrapper[4807]: I1202 20:16:04.557536 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd"] Dec 02 20:16:04 crc kubenswrapper[4807]: I1202 20:16:04.864384 4807 generic.go:334] "Generic (PLEG): container finished" podID="94bd7c70-1bb7-4b0e-816e-5be2df3641da" containerID="177bf37136758b632941a1da9b972d51ad43742b757d7b1dd7effbca73378f38" exitCode=0 Dec 02 20:16:04 crc kubenswrapper[4807]: I1202 20:16:04.864530 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" event={"ID":"94bd7c70-1bb7-4b0e-816e-5be2df3641da","Type":"ContainerDied","Data":"177bf37136758b632941a1da9b972d51ad43742b757d7b1dd7effbca73378f38"} Dec 02 20:16:04 crc kubenswrapper[4807]: I1202 20:16:04.864927 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" event={"ID":"94bd7c70-1bb7-4b0e-816e-5be2df3641da","Type":"ContainerStarted","Data":"48449941a76850f700d2fc9ba8e9e558dfc1c12c7eb983401c22d380540fa47b"} Dec 02 20:16:04 crc kubenswrapper[4807]: I1202 20:16:04.866567 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 20:16:05 crc kubenswrapper[4807]: I1202 20:16:05.873947 4807 generic.go:334] "Generic (PLEG): container finished" podID="94bd7c70-1bb7-4b0e-816e-5be2df3641da" containerID="2d402b5237679cb9a3c24eaddb6c69311f2302790bad1454d76b44f1cf176df1" exitCode=0 Dec 02 20:16:05 crc kubenswrapper[4807]: I1202 20:16:05.874090 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" 
event={"ID":"94bd7c70-1bb7-4b0e-816e-5be2df3641da","Type":"ContainerDied","Data":"2d402b5237679cb9a3c24eaddb6c69311f2302790bad1454d76b44f1cf176df1"} Dec 02 20:16:06 crc kubenswrapper[4807]: I1202 20:16:06.886985 4807 generic.go:334] "Generic (PLEG): container finished" podID="94bd7c70-1bb7-4b0e-816e-5be2df3641da" containerID="c60066b685dca2e1b1a15fef13b7bae0566c7edee63cbe9dd7091530825ddbb7" exitCode=0 Dec 02 20:16:06 crc kubenswrapper[4807]: I1202 20:16:06.887104 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" event={"ID":"94bd7c70-1bb7-4b0e-816e-5be2df3641da","Type":"ContainerDied","Data":"c60066b685dca2e1b1a15fef13b7bae0566c7edee63cbe9dd7091530825ddbb7"} Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.222514 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.288357 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-bundle\") pod \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\" (UID: \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.288440 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8vw7\" (UniqueName: \"kubernetes.io/projected/94bd7c70-1bb7-4b0e-816e-5be2df3641da-kube-api-access-m8vw7\") pod \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\" (UID: \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.288469 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-util\") pod \"94bd7c70-1bb7-4b0e-816e-5be2df3641da\" (UID: 
\"94bd7c70-1bb7-4b0e-816e-5be2df3641da\") " Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.289108 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-bundle" (OuterVolumeSpecName: "bundle") pod "94bd7c70-1bb7-4b0e-816e-5be2df3641da" (UID: "94bd7c70-1bb7-4b0e-816e-5be2df3641da"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.294226 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94bd7c70-1bb7-4b0e-816e-5be2df3641da-kube-api-access-m8vw7" (OuterVolumeSpecName: "kube-api-access-m8vw7") pod "94bd7c70-1bb7-4b0e-816e-5be2df3641da" (UID: "94bd7c70-1bb7-4b0e-816e-5be2df3641da"). InnerVolumeSpecName "kube-api-access-m8vw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.318837 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-util" (OuterVolumeSpecName: "util") pod "94bd7c70-1bb7-4b0e-816e-5be2df3641da" (UID: "94bd7c70-1bb7-4b0e-816e-5be2df3641da"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.389740 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8vw7\" (UniqueName: \"kubernetes.io/projected/94bd7c70-1bb7-4b0e-816e-5be2df3641da-kube-api-access-m8vw7\") on node \"crc\" DevicePath \"\"" Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.389790 4807 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-util\") on node \"crc\" DevicePath \"\"" Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.389801 4807 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94bd7c70-1bb7-4b0e-816e-5be2df3641da-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.904591 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" event={"ID":"94bd7c70-1bb7-4b0e-816e-5be2df3641da","Type":"ContainerDied","Data":"48449941a76850f700d2fc9ba8e9e558dfc1c12c7eb983401c22d380540fa47b"} Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.904644 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48449941a76850f700d2fc9ba8e9e558dfc1c12c7eb983401c22d380540fa47b" Dec 02 20:16:08 crc kubenswrapper[4807]: I1202 20:16:08.904670 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd" Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.668859 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l"] Dec 02 20:16:11 crc kubenswrapper[4807]: E1202 20:16:11.669521 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94bd7c70-1bb7-4b0e-816e-5be2df3641da" containerName="extract" Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.669539 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="94bd7c70-1bb7-4b0e-816e-5be2df3641da" containerName="extract" Dec 02 20:16:11 crc kubenswrapper[4807]: E1202 20:16:11.669556 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94bd7c70-1bb7-4b0e-816e-5be2df3641da" containerName="pull" Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.669565 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="94bd7c70-1bb7-4b0e-816e-5be2df3641da" containerName="pull" Dec 02 20:16:11 crc kubenswrapper[4807]: E1202 20:16:11.669595 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94bd7c70-1bb7-4b0e-816e-5be2df3641da" containerName="util" Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.669604 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="94bd7c70-1bb7-4b0e-816e-5be2df3641da" containerName="util" Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.669752 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="94bd7c70-1bb7-4b0e-816e-5be2df3641da" containerName="extract" Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.670310 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l" Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.676198 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-mfv4z" Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.719299 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l"] Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.751448 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7jzr\" (UniqueName: \"kubernetes.io/projected/9f824d2a-934d-4e25-95dd-6323a038f878-kube-api-access-r7jzr\") pod \"openstack-operator-controller-operator-84f48485bd-tsr5l\" (UID: \"9f824d2a-934d-4e25-95dd-6323a038f878\") " pod="openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l" Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.853343 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7jzr\" (UniqueName: \"kubernetes.io/projected/9f824d2a-934d-4e25-95dd-6323a038f878-kube-api-access-r7jzr\") pod \"openstack-operator-controller-operator-84f48485bd-tsr5l\" (UID: \"9f824d2a-934d-4e25-95dd-6323a038f878\") " pod="openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l" Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.870847 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7jzr\" (UniqueName: \"kubernetes.io/projected/9f824d2a-934d-4e25-95dd-6323a038f878-kube-api-access-r7jzr\") pod \"openstack-operator-controller-operator-84f48485bd-tsr5l\" (UID: \"9f824d2a-934d-4e25-95dd-6323a038f878\") " pod="openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l" Dec 02 20:16:11 crc kubenswrapper[4807]: I1202 20:16:11.987764 4807 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l" Dec 02 20:16:12 crc kubenswrapper[4807]: I1202 20:16:12.435203 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l"] Dec 02 20:16:12 crc kubenswrapper[4807]: I1202 20:16:12.934814 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l" event={"ID":"9f824d2a-934d-4e25-95dd-6323a038f878","Type":"ContainerStarted","Data":"ca3a1ad4839d11663af18f520a3ec16c325d2b782eeff9d893780cb47605e21e"} Dec 02 20:16:16 crc kubenswrapper[4807]: I1202 20:16:16.980930 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l" event={"ID":"9f824d2a-934d-4e25-95dd-6323a038f878","Type":"ContainerStarted","Data":"98142b70fb9bb23cda3462cb0d80ef648cb27099357818d838570d2e101d8d9c"} Dec 02 20:16:16 crc kubenswrapper[4807]: I1202 20:16:16.981004 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l" Dec 02 20:16:17 crc kubenswrapper[4807]: I1202 20:16:17.012382 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l" podStartSLOduration=1.8120744100000001 podStartE2EDuration="6.012365506s" podCreationTimestamp="2025-12-02 20:16:11 +0000 UTC" firstStartedPulling="2025-12-02 20:16:12.442446228 +0000 UTC m=+1107.743353723" lastFinishedPulling="2025-12-02 20:16:16.642737324 +0000 UTC m=+1111.943644819" observedRunningTime="2025-12-02 20:16:17.011381617 +0000 UTC m=+1112.312289122" watchObservedRunningTime="2025-12-02 20:16:17.012365506 +0000 UTC m=+1112.313273001" Dec 02 20:16:21 crc kubenswrapper[4807]: I1202 
20:16:21.991699 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-84f48485bd-tsr5l" Dec 02 20:16:28 crc kubenswrapper[4807]: I1202 20:16:28.293139 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:16:28 crc kubenswrapper[4807]: I1202 20:16:28.293742 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:16:49 crc kubenswrapper[4807]: I1202 20:16:49.929686 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf"] Dec 02 20:16:49 crc kubenswrapper[4807]: I1202 20:16:49.931531 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf" Dec 02 20:16:49 crc kubenswrapper[4807]: I1202 20:16:49.936683 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7"] Dec 02 20:16:49 crc kubenswrapper[4807]: I1202 20:16:49.937745 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ztfxx" Dec 02 20:16:49 crc kubenswrapper[4807]: I1202 20:16:49.938447 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7" Dec 02 20:16:49 crc kubenswrapper[4807]: I1202 20:16:49.940101 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-zdkhk" Dec 02 20:16:49 crc kubenswrapper[4807]: I1202 20:16:49.951111 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7"] Dec 02 20:16:49 crc kubenswrapper[4807]: I1202 20:16:49.965012 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h"] Dec 02 20:16:49 crc kubenswrapper[4807]: I1202 20:16:49.966230 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h" Dec 02 20:16:49 crc kubenswrapper[4807]: I1202 20:16:49.968215 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-nwh96" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.015832 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c"] Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.016928 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.023674 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.025942 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-h6xjr"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.030594 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwcl5\" (UniqueName: \"kubernetes.io/projected/357ab26f-5ac4-46a1-b8f3-89db969b4082-kube-api-access-lwcl5\") pod \"designate-operator-controller-manager-78b4bc895b-p2b2h\" (UID: \"357ab26f-5ac4-46a1-b8f3-89db969b4082\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.030655 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ndpw\" (UniqueName: \"kubernetes.io/projected/cfb9049e-2275-4d96-9131-29bb4def714b-kube-api-access-8ndpw\") pod \"glance-operator-controller-manager-77987cd8cd-mss4c\" (UID: \"cfb9049e-2275-4d96-9131-29bb4def714b\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.030840 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5h5g\" (UniqueName: \"kubernetes.io/projected/7c007dd6-7efa-47c1-af56-ce0bf8fd6f37-kube-api-access-x5h5g\") pod \"barbican-operator-controller-manager-7d9dfd778-r9jtf\" (UID: \"7c007dd6-7efa-47c1-af56-ce0bf8fd6f37\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.030877 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ldhg\" (UniqueName: \"kubernetes.io/projected/3982be8e-b5d2-4795-9312-f3ba8466209c-kube-api-access-4ldhg\") pod \"cinder-operator-controller-manager-859b6ccc6-g2lf7\" (UID: \"3982be8e-b5d2-4795-9312-f3ba8466209c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.039612 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.047831 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.064498 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.065482 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.070313 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-nh64p"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.080421 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.081400 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.091078 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9lf72"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.095798 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.117792 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.126473 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.127874 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.131082 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vrj87"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.131988 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ldhg\" (UniqueName: \"kubernetes.io/projected/3982be8e-b5d2-4795-9312-f3ba8466209c-kube-api-access-4ldhg\") pod \"cinder-operator-controller-manager-859b6ccc6-g2lf7\" (UID: \"3982be8e-b5d2-4795-9312-f3ba8466209c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.132038 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkhqn\" (UniqueName: \"kubernetes.io/projected/f376fe60-0cdf-4b30-ab61-80178d738ea4-kube-api-access-jkhqn\") pod \"horizon-operator-controller-manager-68c6d99b8f-6mvz2\" (UID: \"f376fe60-0cdf-4b30-ab61-80178d738ea4\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.132063 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8wwk\" (UniqueName: \"kubernetes.io/projected/5ceaf50b-92b7-4069-b9b8-660e90c55d97-kube-api-access-f8wwk\") pod \"ironic-operator-controller-manager-6c548fd776-tz65v\" (UID: \"5ceaf50b-92b7-4069-b9b8-660e90c55d97\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.132097 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwcl5\" (UniqueName: \"kubernetes.io/projected/357ab26f-5ac4-46a1-b8f3-89db969b4082-kube-api-access-lwcl5\") pod \"designate-operator-controller-manager-78b4bc895b-p2b2h\" (UID: \"357ab26f-5ac4-46a1-b8f3-89db969b4082\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.132120 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ndpw\" (UniqueName: \"kubernetes.io/projected/cfb9049e-2275-4d96-9131-29bb4def714b-kube-api-access-8ndpw\") pod \"glance-operator-controller-manager-77987cd8cd-mss4c\" (UID: \"cfb9049e-2275-4d96-9131-29bb4def714b\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.132157 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hjsw\" (UniqueName: \"kubernetes.io/projected/3c3d38aa-3600-41f6-97b3-e3699796526e-kube-api-access-7hjsw\") pod \"heat-operator-controller-manager-5f64f6f8bb-kq246\" (UID: \"3c3d38aa-3600-41f6-97b3-e3699796526e\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.132201 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5h5g\" (UniqueName: \"kubernetes.io/projected/7c007dd6-7efa-47c1-af56-ce0bf8fd6f37-kube-api-access-x5h5g\") pod \"barbican-operator-controller-manager-7d9dfd778-r9jtf\" (UID: \"7c007dd6-7efa-47c1-af56-ce0bf8fd6f37\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.137690 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.138997 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.142603 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-x4brg"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.142953 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.156692 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.157819 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.159911 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.164145 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-t2snd"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.182573 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5h5g\" (UniqueName: \"kubernetes.io/projected/7c007dd6-7efa-47c1-af56-ce0bf8fd6f37-kube-api-access-x5h5g\") pod \"barbican-operator-controller-manager-7d9dfd778-r9jtf\" (UID: \"7c007dd6-7efa-47c1-af56-ce0bf8fd6f37\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.182588 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ldhg\" (UniqueName: \"kubernetes.io/projected/3982be8e-b5d2-4795-9312-f3ba8466209c-kube-api-access-4ldhg\") pod \"cinder-operator-controller-manager-859b6ccc6-g2lf7\" (UID: \"3982be8e-b5d2-4795-9312-f3ba8466209c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.183909 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ndpw\" (UniqueName: \"kubernetes.io/projected/cfb9049e-2275-4d96-9131-29bb4def714b-kube-api-access-8ndpw\") pod \"glance-operator-controller-manager-77987cd8cd-mss4c\" (UID: \"cfb9049e-2275-4d96-9131-29bb4def714b\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.184353 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwcl5\" (UniqueName: \"kubernetes.io/projected/357ab26f-5ac4-46a1-b8f3-89db969b4082-kube-api-access-lwcl5\") pod \"designate-operator-controller-manager-78b4bc895b-p2b2h\" (UID: \"357ab26f-5ac4-46a1-b8f3-89db969b4082\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.186833 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.188066 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.189998 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-gkkc5"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.193757 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.200331 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.225927 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.235280 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkhqn\" (UniqueName: \"kubernetes.io/projected/f376fe60-0cdf-4b30-ab61-80178d738ea4-kube-api-access-jkhqn\") pod \"horizon-operator-controller-manager-68c6d99b8f-6mvz2\" (UID: \"f376fe60-0cdf-4b30-ab61-80178d738ea4\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.235419 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8wwk\" (UniqueName: \"kubernetes.io/projected/5ceaf50b-92b7-4069-b9b8-660e90c55d97-kube-api-access-f8wwk\") pod \"ironic-operator-controller-manager-6c548fd776-tz65v\" (UID: \"5ceaf50b-92b7-4069-b9b8-660e90c55d97\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.235515 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert\") pod \"infra-operator-controller-manager-57548d458d-dn7v2\" (UID: \"d7712aec-0995-489a-8cee-7e68fbf130df\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.235593 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmvtw\" (UniqueName: \"kubernetes.io/projected/fa1a5516-5c0d-4d4e-b052-d9301371a2d3-kube-api-access-lmvtw\") pod \"manila-operator-controller-manager-7c79b5df47-wzdmc\" (UID: \"fa1a5516-5c0d-4d4e-b052-d9301371a2d3\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.235758 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hjsw\" (UniqueName: \"kubernetes.io/projected/3c3d38aa-3600-41f6-97b3-e3699796526e-kube-api-access-7hjsw\") pod \"heat-operator-controller-manager-5f64f6f8bb-kq246\" (UID: \"3c3d38aa-3600-41f6-97b3-e3699796526e\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.235850 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhjxr\" (UniqueName: \"kubernetes.io/projected/9198e60c-4301-40b6-9d1b-3e91a2f10fa5-kube-api-access-bhjxr\") pod \"keystone-operator-controller-manager-7765d96ddf-gmwq8\" (UID: \"9198e60c-4301-40b6-9d1b-3e91a2f10fa5\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.235926 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lt2c\" (UniqueName: \"kubernetes.io/projected/d7712aec-0995-489a-8cee-7e68fbf130df-kube-api-access-4lt2c\") pod \"infra-operator-controller-manager-57548d458d-dn7v2\" (UID: \"d7712aec-0995-489a-8cee-7e68fbf130df\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.255802 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.257216 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.263002 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.263806 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-6pjz7"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.265370 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.266051 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hjsw\" (UniqueName: \"kubernetes.io/projected/3c3d38aa-3600-41f6-97b3-e3699796526e-kube-api-access-7hjsw\") pod \"heat-operator-controller-manager-5f64f6f8bb-kq246\" (UID: \"3c3d38aa-3600-41f6-97b3-e3699796526e\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.287411 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkhqn\" (UniqueName: \"kubernetes.io/projected/f376fe60-0cdf-4b30-ab61-80178d738ea4-kube-api-access-jkhqn\") pod \"horizon-operator-controller-manager-68c6d99b8f-6mvz2\" (UID: \"f376fe60-0cdf-4b30-ab61-80178d738ea4\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.295368 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.296502 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8wwk\" (UniqueName: \"kubernetes.io/projected/5ceaf50b-92b7-4069-b9b8-660e90c55d97-kube-api-access-f8wwk\") pod \"ironic-operator-controller-manager-6c548fd776-tz65v\" (UID: \"5ceaf50b-92b7-4069-b9b8-660e90c55d97\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.331295 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-g8559"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.334431 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.338914 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dg5p\" (UniqueName: \"kubernetes.io/projected/7cccb577-4849-4e1c-b38e-669f7658eb2e-kube-api-access-9dg5p\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-gd4bj\" (UID: \"7cccb577-4849-4e1c-b38e-669f7658eb2e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.339089 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert\") pod \"infra-operator-controller-manager-57548d458d-dn7v2\" (UID: \"d7712aec-0995-489a-8cee-7e68fbf130df\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.339120 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwbp2\" (UniqueName: \"kubernetes.io/projected/6511dc8d-00b4-4937-a330-0f5cf9c06fdd-kube-api-access-vwbp2\") pod \"nova-operator-controller-manager-697bc559fc-g8559\" (UID: \"6511dc8d-00b4-4937-a330-0f5cf9c06fdd\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.339152 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmvtw\" (UniqueName: \"kubernetes.io/projected/fa1a5516-5c0d-4d4e-b052-d9301371a2d3-kube-api-access-lmvtw\") pod \"manila-operator-controller-manager-7c79b5df47-wzdmc\" (UID: \"fa1a5516-5c0d-4d4e-b052-d9301371a2d3\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.339454 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhjxr\" (UniqueName: \"kubernetes.io/projected/9198e60c-4301-40b6-9d1b-3e91a2f10fa5-kube-api-access-bhjxr\") pod \"keystone-operator-controller-manager-7765d96ddf-gmwq8\" (UID: \"9198e60c-4301-40b6-9d1b-3e91a2f10fa5\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.339505 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lt2c\" (UniqueName: \"kubernetes.io/projected/d7712aec-0995-489a-8cee-7e68fbf130df-kube-api-access-4lt2c\") pod \"infra-operator-controller-manager-57548d458d-dn7v2\" (UID: \"d7712aec-0995-489a-8cee-7e68fbf130df\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.339811 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c"
Dec 02 20:16:50 crc kubenswrapper[4807]: E1202 20:16:50.340330 4807 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 02 20:16:50 crc kubenswrapper[4807]: E1202 20:16:50.340471 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert podName:d7712aec-0995-489a-8cee-7e68fbf130df nodeName:}" failed. No retries permitted until 2025-12-02 20:16:50.840450871 +0000 UTC m=+1146.141358366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert") pod "infra-operator-controller-manager-57548d458d-dn7v2" (UID: "d7712aec-0995-489a-8cee-7e68fbf130df") : secret "infra-operator-webhook-server-cert" not found
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.407684 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.409226 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-2jn2m"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.411029 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.423506 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lt2c\" (UniqueName: \"kubernetes.io/projected/d7712aec-0995-489a-8cee-7e68fbf130df-kube-api-access-4lt2c\") pod \"infra-operator-controller-manager-57548d458d-dn7v2\" (UID: \"d7712aec-0995-489a-8cee-7e68fbf130df\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.431396 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmvtw\" (UniqueName: \"kubernetes.io/projected/fa1a5516-5c0d-4d4e-b052-d9301371a2d3-kube-api-access-lmvtw\") pod \"manila-operator-controller-manager-7c79b5df47-wzdmc\" (UID: \"fa1a5516-5c0d-4d4e-b052-d9301371a2d3\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.442961 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dg5p\" (UniqueName: \"kubernetes.io/projected/7cccb577-4849-4e1c-b38e-669f7658eb2e-kube-api-access-9dg5p\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-gd4bj\" (UID: \"7cccb577-4849-4e1c-b38e-669f7658eb2e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.443021 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwbp2\" (UniqueName: \"kubernetes.io/projected/6511dc8d-00b4-4937-a330-0f5cf9c06fdd-kube-api-access-vwbp2\") pod \"nova-operator-controller-manager-697bc559fc-g8559\" (UID: \"6511dc8d-00b4-4937-a330-0f5cf9c06fdd\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.448807 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.463654 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.494444 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.495775 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.509594 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-fl4tv"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.544585 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwx22\" (UniqueName: \"kubernetes.io/projected/490b3442-f4b4-493d-824a-67e370ac26f9-kube-api-access-dwx22\") pod \"mariadb-operator-controller-manager-56bbcc9d85-52v8m\" (UID: \"490b3442-f4b4-493d-824a-67e370ac26f9\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.553158 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.574668 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhjxr\" (UniqueName: \"kubernetes.io/projected/9198e60c-4301-40b6-9d1b-3e91a2f10fa5-kube-api-access-bhjxr\") pod \"keystone-operator-controller-manager-7765d96ddf-gmwq8\" (UID: \"9198e60c-4301-40b6-9d1b-3e91a2f10fa5\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.597484 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dg5p\" (UniqueName: \"kubernetes.io/projected/7cccb577-4849-4e1c-b38e-669f7658eb2e-kube-api-access-9dg5p\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-gd4bj\" (UID: \"7cccb577-4849-4e1c-b38e-669f7658eb2e\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.613574 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9g547"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.614761 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9g547"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.624464 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-fqmsb"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.643206 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-g8559"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.644810 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwbp2\" (UniqueName: \"kubernetes.io/projected/6511dc8d-00b4-4937-a330-0f5cf9c06fdd-kube-api-access-vwbp2\") pod \"nova-operator-controller-manager-697bc559fc-g8559\" (UID: \"6511dc8d-00b4-4937-a330-0f5cf9c06fdd\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.645626 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwx22\" (UniqueName: \"kubernetes.io/projected/490b3442-f4b4-493d-824a-67e370ac26f9-kube-api-access-dwx22\") pod \"mariadb-operator-controller-manager-56bbcc9d85-52v8m\" (UID: \"490b3442-f4b4-493d-824a-67e370ac26f9\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.655673 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.696279 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwx22\" (UniqueName: \"kubernetes.io/projected/490b3442-f4b4-493d-824a-67e370ac26f9-kube-api-access-dwx22\") pod \"mariadb-operator-controller-manager-56bbcc9d85-52v8m\" (UID: \"490b3442-f4b4-493d-824a-67e370ac26f9\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.703712 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9g547"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.723854 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.725227 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.725576 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.727366 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.728367 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-wbz6p"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.730883 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.738981 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-682jd"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.739059 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.740319 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.742860 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-n9tm2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.751500 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64vw9\" (UniqueName: \"kubernetes.io/projected/1f4141f5-ba14-4c49-b114-07e5d506b255-kube-api-access-64vw9\") pod \"octavia-operator-controller-manager-998648c74-9g547\" (UID: \"1f4141f5-ba14-4c49-b114-07e5d506b255\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9g547"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.753669 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.759578 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.766208 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.781265 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.781334 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.788710 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.789910 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.796807 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.797621 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5r9jf"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.809227 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.811139 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.815225 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-p265z"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.828084 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.834108 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.847827 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-qrthg"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.848993 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.851497 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-qrthg"]
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.853216 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert\") pod \"infra-operator-controller-manager-57548d458d-dn7v2\" (UID: \"d7712aec-0995-489a-8cee-7e68fbf130df\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.853406 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzhz6\" (UniqueName: \"kubernetes.io/projected/b2ef8498-337b-40c6-b122-19863c876321-kube-api-access-jzhz6\") pod \"placement-operator-controller-manager-78f8948974-nh4hg\" (UID: \"b2ef8498-337b-40c6-b122-19863c876321\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.853572 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk\" (UID: \"e0e35837-6389-4e86-b8c5-46105f1332cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.853652 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64vw9\" (UniqueName: \"kubernetes.io/projected/1f4141f5-ba14-4c49-b114-07e5d506b255-kube-api-access-64vw9\") pod \"octavia-operator-controller-manager-998648c74-9g547\" (UID: \"1f4141f5-ba14-4c49-b114-07e5d506b255\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9g547"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.853779 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn5lr\" (UniqueName: \"kubernetes.io/projected/959355af-f6bd-492c-af58-9a7378224225-kube-api-access-wn5lr\") pod \"ovn-operator-controller-manager-b6456fdb6-4qmqw\" (UID: \"959355af-f6bd-492c-af58-9a7378224225\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw"
Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.854961 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwvrc\" (UniqueName: \"kubernetes.io/projected/e0e35837-6389-4e86-b8c5-46105f1332cb-kube-api-access-lwvrc\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk\" (UID: \"e0e35837-6389-4e86-b8c5-46105f1332cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk"
Dec 02 20:16:50 crc kubenswrapper[4807]: E1202 20:16:50.853832 4807 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 02 20:16:50 crc kubenswrapper[4807]: E1202 20:16:50.855800 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert podName:d7712aec-0995-489a-8cee-7e68fbf130df nodeName:}" failed. No retries permitted until 2025-12-02 20:16:51.855776323 +0000 UTC m=+1147.156683828 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert") pod "infra-operator-controller-manager-57548d458d-dn7v2" (UID: "d7712aec-0995-489a-8cee-7e68fbf130df") : secret "infra-operator-webhook-server-cert" not found Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.855286 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-p6mh4" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.866237 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5"] Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.868860 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.872735 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5"] Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.885792 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.888250 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64vw9\" (UniqueName: \"kubernetes.io/projected/1f4141f5-ba14-4c49-b114-07e5d506b255-kube-api-access-64vw9\") pod \"octavia-operator-controller-manager-998648c74-9g547\" (UID: \"1f4141f5-ba14-4c49-b114-07e5d506b255\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9g547" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.891536 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-th7kd" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.968771 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc4ht\" (UniqueName: \"kubernetes.io/projected/733f7038-d2b9-4047-8ee9-3ad49a55729d-kube-api-access-vc4ht\") pod \"test-operator-controller-manager-5854674fcc-qrthg\" (UID: \"733f7038-d2b9-4047-8ee9-3ad49a55729d\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.968918 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzhz6\" (UniqueName: \"kubernetes.io/projected/b2ef8498-337b-40c6-b122-19863c876321-kube-api-access-jzhz6\") pod \"placement-operator-controller-manager-78f8948974-nh4hg\" (UID: \"b2ef8498-337b-40c6-b122-19863c876321\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.968960 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert\") pod 
\"openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk\" (UID: \"e0e35837-6389-4e86-b8c5-46105f1332cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.969181 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn5lr\" (UniqueName: \"kubernetes.io/projected/959355af-f6bd-492c-af58-9a7378224225-kube-api-access-wn5lr\") pod \"ovn-operator-controller-manager-b6456fdb6-4qmqw\" (UID: \"959355af-f6bd-492c-af58-9a7378224225\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.985789 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwvrc\" (UniqueName: \"kubernetes.io/projected/e0e35837-6389-4e86-b8c5-46105f1332cb-kube-api-access-lwvrc\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk\" (UID: \"e0e35837-6389-4e86-b8c5-46105f1332cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.985941 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5657t\" (UniqueName: \"kubernetes.io/projected/8dd00692-0728-47cf-b8fa-ab812b11ec8f-kube-api-access-5657t\") pod \"telemetry-operator-controller-manager-76cc84c6bb-tzhtf\" (UID: \"8dd00692-0728-47cf-b8fa-ab812b11ec8f\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" Dec 02 20:16:50 crc kubenswrapper[4807]: I1202 20:16:50.985967 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mnmf\" (UniqueName: \"kubernetes.io/projected/b53c3dbf-2380-4dff-9b18-a2207efcce60-kube-api-access-2mnmf\") pod \"swift-operator-controller-manager-5f8c65bbfc-7pqn2\" (UID: 
\"b53c3dbf-2380-4dff-9b18-a2207efcce60\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2" Dec 02 20:16:50 crc kubenswrapper[4807]: E1202 20:16:50.989808 4807 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 20:16:50 crc kubenswrapper[4807]: E1202 20:16:50.989891 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert podName:e0e35837-6389-4e86-b8c5-46105f1332cb nodeName:}" failed. No retries permitted until 2025-12-02 20:16:51.489863341 +0000 UTC m=+1146.790770836 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" (UID: "e0e35837-6389-4e86-b8c5-46105f1332cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.039557 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzhz6\" (UniqueName: \"kubernetes.io/projected/b2ef8498-337b-40c6-b122-19863c876321-kube-api-access-jzhz6\") pod \"placement-operator-controller-manager-78f8948974-nh4hg\" (UID: \"b2ef8498-337b-40c6-b122-19863c876321\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.043056 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwvrc\" (UniqueName: \"kubernetes.io/projected/e0e35837-6389-4e86-b8c5-46105f1332cb-kube-api-access-lwvrc\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk\" (UID: \"e0e35837-6389-4e86-b8c5-46105f1332cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:16:51 crc 
kubenswrapper[4807]: I1202 20:16:51.092667 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn5lr\" (UniqueName: \"kubernetes.io/projected/959355af-f6bd-492c-af58-9a7378224225-kube-api-access-wn5lr\") pod \"ovn-operator-controller-manager-b6456fdb6-4qmqw\" (UID: \"959355af-f6bd-492c-af58-9a7378224225\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.104819 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mnmf\" (UniqueName: \"kubernetes.io/projected/b53c3dbf-2380-4dff-9b18-a2207efcce60-kube-api-access-2mnmf\") pod \"swift-operator-controller-manager-5f8c65bbfc-7pqn2\" (UID: \"b53c3dbf-2380-4dff-9b18-a2207efcce60\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.105588 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc4ht\" (UniqueName: \"kubernetes.io/projected/733f7038-d2b9-4047-8ee9-3ad49a55729d-kube-api-access-vc4ht\") pod \"test-operator-controller-manager-5854674fcc-qrthg\" (UID: \"733f7038-d2b9-4047-8ee9-3ad49a55729d\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.141430 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tbqj\" (UniqueName: \"kubernetes.io/projected/2a376983-2f33-465c-9781-391b67941e21-kube-api-access-8tbqj\") pod \"watcher-operator-controller-manager-58888ff59d-cwws5\" (UID: \"2a376983-2f33-465c-9781-391b67941e21\") " pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.141841 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5657t\" (UniqueName: 
\"kubernetes.io/projected/8dd00692-0728-47cf-b8fa-ab812b11ec8f-kube-api-access-5657t\") pod \"telemetry-operator-controller-manager-76cc84c6bb-tzhtf\" (UID: \"8dd00692-0728-47cf-b8fa-ab812b11ec8f\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.151063 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9g547" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.176690 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh"] Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.177635 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.183234 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7vf4b" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.196513 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5657t\" (UniqueName: \"kubernetes.io/projected/8dd00692-0728-47cf-b8fa-ab812b11ec8f-kube-api-access-5657t\") pod \"telemetry-operator-controller-manager-76cc84c6bb-tzhtf\" (UID: \"8dd00692-0728-47cf-b8fa-ab812b11ec8f\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.196588 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh"] Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.197695 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mnmf\" (UniqueName: 
\"kubernetes.io/projected/b53c3dbf-2380-4dff-9b18-a2207efcce60-kube-api-access-2mnmf\") pod \"swift-operator-controller-manager-5f8c65bbfc-7pqn2\" (UID: \"b53c3dbf-2380-4dff-9b18-a2207efcce60\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.198052 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.198882 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.199088 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.199461 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc4ht\" (UniqueName: \"kubernetes.io/projected/733f7038-d2b9-4047-8ee9-3ad49a55729d-kube-api-access-vc4ht\") pod \"test-operator-controller-manager-5854674fcc-qrthg\" (UID: \"733f7038-d2b9-4047-8ee9-3ad49a55729d\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.222542 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.228976 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2"] Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.231875 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.234586 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2"] Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.240045 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.253760 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tbqj\" (UniqueName: \"kubernetes.io/projected/2a376983-2f33-465c-9781-391b67941e21-kube-api-access-8tbqj\") pod \"watcher-operator-controller-manager-58888ff59d-cwws5\" (UID: \"2a376983-2f33-465c-9781-391b67941e21\") " pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.253969 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6w8zm" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.290812 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tbqj\" (UniqueName: \"kubernetes.io/projected/2a376983-2f33-465c-9781-391b67941e21-kube-api-access-8tbqj\") pod \"watcher-operator-controller-manager-58888ff59d-cwws5\" (UID: \"2a376983-2f33-465c-9781-391b67941e21\") " pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.300142 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.326258 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.408824 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.409988 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.412236 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zvmg\" (UniqueName: \"kubernetes.io/projected/4039c119-ee84-4043-8892-733499aabdc5-kube-api-access-9zvmg\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.412437 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.412544 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mxml\" (UniqueName: 
\"kubernetes.io/projected/3035086c-2661-4720-97b1-df4d0cd891a6-kube-api-access-7mxml\") pod \"rabbitmq-cluster-operator-manager-668c99d594-t8rj2\" (UID: \"3035086c-2661-4720-97b1-df4d0cd891a6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.417955 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf"] Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.462584 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7"] Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.514154 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.514222 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mxml\" (UniqueName: \"kubernetes.io/projected/3035086c-2661-4720-97b1-df4d0cd891a6-kube-api-access-7mxml\") pod \"rabbitmq-cluster-operator-manager-668c99d594-t8rj2\" (UID: \"3035086c-2661-4720-97b1-df4d0cd891a6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.514310 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk\" (UID: \"e0e35837-6389-4e86-b8c5-46105f1332cb\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.514356 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.514393 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zvmg\" (UniqueName: \"kubernetes.io/projected/4039c119-ee84-4043-8892-733499aabdc5-kube-api-access-9zvmg\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:51 crc kubenswrapper[4807]: E1202 20:16:51.514952 4807 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 20:16:51 crc kubenswrapper[4807]: E1202 20:16:51.515005 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. No retries permitted until 2025-12-02 20:16:52.01498889 +0000 UTC m=+1147.315896385 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "webhook-server-cert" not found Dec 02 20:16:51 crc kubenswrapper[4807]: E1202 20:16:51.515098 4807 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 20:16:51 crc kubenswrapper[4807]: E1202 20:16:51.515124 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert podName:e0e35837-6389-4e86-b8c5-46105f1332cb nodeName:}" failed. No retries permitted until 2025-12-02 20:16:52.515117874 +0000 UTC m=+1147.816025369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" (UID: "e0e35837-6389-4e86-b8c5-46105f1332cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 20:16:51 crc kubenswrapper[4807]: E1202 20:16:51.515160 4807 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 20:16:51 crc kubenswrapper[4807]: E1202 20:16:51.515178 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. No retries permitted until 2025-12-02 20:16:52.015172136 +0000 UTC m=+1147.316079621 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "metrics-server-cert" not found Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.556743 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zvmg\" (UniqueName: \"kubernetes.io/projected/4039c119-ee84-4043-8892-733499aabdc5-kube-api-access-9zvmg\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.558854 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mxml\" (UniqueName: \"kubernetes.io/projected/3035086c-2661-4720-97b1-df4d0cd891a6-kube-api-access-7mxml\") pod \"rabbitmq-cluster-operator-manager-668c99d594-t8rj2\" (UID: \"3035086c-2661-4720-97b1-df4d0cd891a6\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.612876 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2" Dec 02 20:16:51 crc kubenswrapper[4807]: I1202 20:16:51.929852 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert\") pod \"infra-operator-controller-manager-57548d458d-dn7v2\" (UID: \"d7712aec-0995-489a-8cee-7e68fbf130df\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" Dec 02 20:16:51 crc kubenswrapper[4807]: E1202 20:16:51.930285 4807 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 20:16:51 crc kubenswrapper[4807]: E1202 20:16:51.930343 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert podName:d7712aec-0995-489a-8cee-7e68fbf130df nodeName:}" failed. No retries permitted until 2025-12-02 20:16:53.930324946 +0000 UTC m=+1149.231232431 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert") pod "infra-operator-controller-manager-57548d458d-dn7v2" (UID: "d7712aec-0995-489a-8cee-7e68fbf130df") : secret "infra-operator-webhook-server-cert" not found Dec 02 20:16:52 crc kubenswrapper[4807]: I1202 20:16:52.031447 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:52 crc kubenswrapper[4807]: I1202 20:16:52.031548 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:52 crc kubenswrapper[4807]: E1202 20:16:52.031705 4807 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 20:16:52 crc kubenswrapper[4807]: E1202 20:16:52.031868 4807 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 20:16:52 crc kubenswrapper[4807]: E1202 20:16:52.032379 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. No retries permitted until 2025-12-02 20:16:53.032307251 +0000 UTC m=+1148.333214746 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "metrics-server-cert" not found Dec 02 20:16:52 crc kubenswrapper[4807]: E1202 20:16:52.032394 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. No retries permitted until 2025-12-02 20:16:53.032388763 +0000 UTC m=+1148.333296258 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "webhook-server-cert" not found Dec 02 20:16:52 crc kubenswrapper[4807]: I1202 20:16:52.430111 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c"] Dec 02 20:16:52 crc kubenswrapper[4807]: I1202 20:16:52.433317 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf" event={"ID":"7c007dd6-7efa-47c1-af56-ce0bf8fd6f37","Type":"ContainerStarted","Data":"c9fcd81409d4e176e18b87b45f88f795da6d298e60492572cee79f3fb900c1a6"} Dec 02 20:16:52 crc kubenswrapper[4807]: W1202 20:16:52.434154 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfb9049e_2275_4d96_9131_29bb4def714b.slice/crio-9b0ecaf6909ca6331360fb75345c43d13b302195f0c444596ca2f6d34eed0567 WatchSource:0}: Error finding container 9b0ecaf6909ca6331360fb75345c43d13b302195f0c444596ca2f6d34eed0567: Status 404 returned error can't find the container with id 
9b0ecaf6909ca6331360fb75345c43d13b302195f0c444596ca2f6d34eed0567 Dec 02 20:16:52 crc kubenswrapper[4807]: I1202 20:16:52.435380 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7" event={"ID":"3982be8e-b5d2-4795-9312-f3ba8466209c","Type":"ContainerStarted","Data":"85f6fba1984d9eafdf88191f3f2269a454bf8502a610ad65643e5b7d7ba2c45e"} Dec 02 20:16:52 crc kubenswrapper[4807]: I1202 20:16:52.550597 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk\" (UID: \"e0e35837-6389-4e86-b8c5-46105f1332cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:16:52 crc kubenswrapper[4807]: E1202 20:16:52.550859 4807 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 20:16:52 crc kubenswrapper[4807]: E1202 20:16:52.550976 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert podName:e0e35837-6389-4e86-b8c5-46105f1332cb nodeName:}" failed. No retries permitted until 2025-12-02 20:16:54.5509465 +0000 UTC m=+1149.851853995 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" (UID: "e0e35837-6389-4e86-b8c5-46105f1332cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 20:16:52 crc kubenswrapper[4807]: I1202 20:16:52.845698 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v"] Dec 02 20:16:52 crc kubenswrapper[4807]: W1202 20:16:52.851969 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ceaf50b_92b7_4069_b9b8_660e90c55d97.slice/crio-58c4e52fa4daf9e1cb13934ad49ac03ebef83873cf96a49721b3ac3925f5ea7d WatchSource:0}: Error finding container 58c4e52fa4daf9e1cb13934ad49ac03ebef83873cf96a49721b3ac3925f5ea7d: Status 404 returned error can't find the container with id 58c4e52fa4daf9e1cb13934ad49ac03ebef83873cf96a49721b3ac3925f5ea7d Dec 02 20:16:52 crc kubenswrapper[4807]: I1202 20:16:52.889739 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246"] Dec 02 20:16:52 crc kubenswrapper[4807]: W1202 20:16:52.901991 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3d38aa_3600_41f6_97b3_e3699796526e.slice/crio-87904bfc6c5dfdebccf17b804ebb033c0ef41cf0b723ae16a9f656916cd2f076 WatchSource:0}: Error finding container 87904bfc6c5dfdebccf17b804ebb033c0ef41cf0b723ae16a9f656916cd2f076: Status 404 returned error can't find the container with id 87904bfc6c5dfdebccf17b804ebb033c0ef41cf0b723ae16a9f656916cd2f076 Dec 02 20:16:52 crc kubenswrapper[4807]: I1202 20:16:52.910219 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj"] Dec 02 20:16:52 
crc kubenswrapper[4807]: I1202 20:16:52.937078 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc"] Dec 02 20:16:52 crc kubenswrapper[4807]: I1202 20:16:52.949186 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h"] Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.057789 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.057891 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.058065 4807 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.058187 4807 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.058241 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. 
No retries permitted until 2025-12-02 20:16:55.058223825 +0000 UTC m=+1150.359131320 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "metrics-server-cert" not found Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.058954 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. No retries permitted until 2025-12-02 20:16:55.058941946 +0000 UTC m=+1150.359849441 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "webhook-server-cert" not found Dec 02 20:16:53 crc kubenswrapper[4807]: W1202 20:16:53.062088 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod959355af_f6bd_492c_af58_9a7378224225.slice/crio-b913fddb49a40b249f46d37d84f38b1b5b4236202af1d18234f9a07c3c707015 WatchSource:0}: Error finding container b913fddb49a40b249f46d37d84f38b1b5b4236202af1d18234f9a07c3c707015: Status 404 returned error can't find the container with id b913fddb49a40b249f46d37d84f38b1b5b4236202af1d18234f9a07c3c707015 Dec 02 20:16:53 crc kubenswrapper[4807]: W1202 20:16:53.064555 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3035086c_2661_4720_97b1_df4d0cd891a6.slice/crio-93ee4188427319138a62de1f3379868d2eeba3abc29e631239afce0bb91f11c7 WatchSource:0}: Error finding container 
93ee4188427319138a62de1f3379868d2eeba3abc29e631239afce0bb91f11c7: Status 404 returned error can't find the container with id 93ee4188427319138a62de1f3379868d2eeba3abc29e631239afce0bb91f11c7 Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.065999 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8"] Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.066037 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9g547"] Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.073625 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg"] Dec 02 20:16:53 crc kubenswrapper[4807]: W1202 20:16:53.075330 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9198e60c_4301_40b6_9d1b_3e91a2f10fa5.slice/crio-f77ad8a424e3ab9e5859e981abcb22a59c04662ca84f896ca40eacb068695e14 WatchSource:0}: Error finding container f77ad8a424e3ab9e5859e981abcb22a59c04662ca84f896ca40eacb068695e14: Status 404 returned error can't find the container with id f77ad8a424e3ab9e5859e981abcb22a59c04662ca84f896ca40eacb068695e14 Dec 02 20:16:53 crc kubenswrapper[4807]: W1202 20:16:53.079246 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6511dc8d_00b4_4937_a330_0f5cf9c06fdd.slice/crio-cd0260245a8141729b5227692aba9471b9aef837a3716dfb9321b37ef8454afd WatchSource:0}: Error finding container cd0260245a8141729b5227692aba9471b9aef837a3716dfb9321b37ef8454afd: Status 404 returned error can't find the container with id cd0260245a8141729b5227692aba9471b9aef837a3716dfb9321b37ef8454afd Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.082661 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vc4ht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-qrthg_openstack-operators(733f7038-d2b9-4047-8ee9-3ad49a55729d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.086049 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m"] Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.086147 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bhjxr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-gmwq8_openstack-operators(9198e60c-4301-40b6-9d1b-3e91a2f10fa5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.086438 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vc4ht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-qrthg_openstack-operators(733f7038-d2b9-4047-8ee9-3ad49a55729d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.086601 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vwbp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-g8559_openstack-operators(6511dc8d-00b4-4937-a330-0f5cf9c06fdd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.088384 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bhjxr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-gmwq8_openstack-operators(9198e60c-4301-40b6-9d1b-3e91a2f10fa5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.088417 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" podUID="733f7038-d2b9-4047-8ee9-3ad49a55729d" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.090175 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.5:5001/openstack-k8s-operators/watcher-operator:0e562967e0a192baf562500f49cef0abd8c6f6ec,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8tbqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-58888ff59d-cwws5_openstack-operators(2a376983-2f33-465c-9781-391b67941e21): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.090279 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vwbp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-g8559_openstack-operators(6511dc8d-00b4-4937-a330-0f5cf9c06fdd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.090336 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8" podUID="9198e60c-4301-40b6-9d1b-3e91a2f10fa5" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.091161 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dwx22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-52v8m_openstack-operators(490b3442-f4b4-493d-824a-67e370ac26f9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.091338 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5657t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-tzhtf_openstack-operators(8dd00692-0728-47cf-b8fa-ab812b11ec8f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.091424 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559" podUID="6511dc8d-00b4-4937-a330-0f5cf9c06fdd" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.092008 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8tbqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-58888ff59d-cwws5_openstack-operators(2a376983-2f33-465c-9781-391b67941e21): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.093237 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" podUID="2a376983-2f33-465c-9781-391b67941e21" Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.095710 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2"] Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.096029 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dwx22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-52v8m_openstack-operators(490b3442-f4b4-493d-824a-67e370ac26f9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.096071 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5657t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-tzhtf_openstack-operators(8dd00692-0728-47cf-b8fa-ab812b11ec8f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.097310 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" podUID="8dd00692-0728-47cf-b8fa-ab812b11ec8f" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.097529 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m" podUID="490b3442-f4b4-493d-824a-67e370ac26f9" Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.101846 4807 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2"] Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.109586 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-g8559"] Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.115438 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-qrthg"] Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.122043 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf"] Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.129366 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2"] Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.132647 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw"] Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.136438 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5"] Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.447826 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v" event={"ID":"5ceaf50b-92b7-4069-b9b8-660e90c55d97","Type":"ContainerStarted","Data":"58c4e52fa4daf9e1cb13934ad49ac03ebef83873cf96a49721b3ac3925f5ea7d"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.449666 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" 
event={"ID":"2a376983-2f33-465c-9781-391b67941e21","Type":"ContainerStarted","Data":"501b9eb77d4dc61288e619f560fc11810b932feb205586b30b865ba8c77efe17"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.452003 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9g547" event={"ID":"1f4141f5-ba14-4c49-b114-07e5d506b255","Type":"ContainerStarted","Data":"8606b4dd052c184ed6bb6b6f99e5e9c316270a86aad593c4eacdd8af09623a26"} Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.453888 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/watcher-operator:0e562967e0a192baf562500f49cef0abd8c6f6ec\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" podUID="2a376983-2f33-465c-9781-391b67941e21" Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.455692 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj" event={"ID":"7cccb577-4849-4e1c-b38e-669f7658eb2e","Type":"ContainerStarted","Data":"b080a5cbfea1e757a5a8e6a59135b15d61750a19bcef391d12aa2011dd0242c2"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.458535 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2" event={"ID":"3035086c-2661-4720-97b1-df4d0cd891a6","Type":"ContainerStarted","Data":"93ee4188427319138a62de1f3379868d2eeba3abc29e631239afce0bb91f11c7"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.462518 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h" event={"ID":"357ab26f-5ac4-46a1-b8f3-89db969b4082","Type":"ContainerStarted","Data":"f1b879d410147624a8e0afac5cd1c6db04fbe6a712104de40e45e2778a08b573"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.476468 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m" event={"ID":"490b3442-f4b4-493d-824a-67e370ac26f9","Type":"ContainerStarted","Data":"214cdf38665d7febcb5c553c5ec4c1d48e02ab7aab3ed299d9ae68bbb2dc01f7"} Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.481238 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m" podUID="490b3442-f4b4-493d-824a-67e370ac26f9" Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.482111 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc" event={"ID":"fa1a5516-5c0d-4d4e-b052-d9301371a2d3","Type":"ContainerStarted","Data":"f4ae8d56e2e69bae4c3bc54fc04a46a4b5b33f7825d5b278aee57d2c7cb722d5"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.484576 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" event={"ID":"733f7038-d2b9-4047-8ee9-3ad49a55729d","Type":"ContainerStarted","Data":"1defc9127e47c220d6e18c5f35d5649e011b8c5f70e55ebbb6afad6293b25df4"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.487473 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8" event={"ID":"9198e60c-4301-40b6-9d1b-3e91a2f10fa5","Type":"ContainerStarted","Data":"f77ad8a424e3ab9e5859e981abcb22a59c04662ca84f896ca40eacb068695e14"} Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.490792 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" podUID="733f7038-d2b9-4047-8ee9-3ad49a55729d" Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.491807 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw" event={"ID":"959355af-f6bd-492c-af58-9a7378224225","Type":"ContainerStarted","Data":"b913fddb49a40b249f46d37d84f38b1b5b4236202af1d18234f9a07c3c707015"} Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.492104 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8" podUID="9198e60c-4301-40b6-9d1b-3e91a2f10fa5" Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.502445 4807 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2" event={"ID":"b53c3dbf-2380-4dff-9b18-a2207efcce60","Type":"ContainerStarted","Data":"a8c37da67a4aeec2ad8ac2faee8c0bce8658a1d0b0446ed68da56bf16cfb47f6"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.504445 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg" event={"ID":"b2ef8498-337b-40c6-b122-19863c876321","Type":"ContainerStarted","Data":"7c259ce1bdd0727ce4c77679b91ef5f81de029acd96d96d4c23f6fd8edc60a15"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.508340 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559" event={"ID":"6511dc8d-00b4-4937-a330-0f5cf9c06fdd","Type":"ContainerStarted","Data":"cd0260245a8141729b5227692aba9471b9aef837a3716dfb9321b37ef8454afd"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.510137 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" event={"ID":"8dd00692-0728-47cf-b8fa-ab812b11ec8f","Type":"ContainerStarted","Data":"59ef7574ac7a4ea41e6ffb9a455343298e255e79c2b9f2ddcaf2ad5590b05520"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.516548 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246" event={"ID":"3c3d38aa-3600-41f6-97b3-e3699796526e","Type":"ContainerStarted","Data":"87904bfc6c5dfdebccf17b804ebb033c0ef41cf0b723ae16a9f656916cd2f076"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.519211 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2" 
event={"ID":"f376fe60-0cdf-4b30-ab61-80178d738ea4","Type":"ContainerStarted","Data":"bc2fa97eb0b89c1745ef65695c3a1f0e26dca3ba83c7ac60de8e004f6abd0305"} Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.519339 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" podUID="8dd00692-0728-47cf-b8fa-ab812b11ec8f" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.519335 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559" podUID="6511dc8d-00b4-4937-a330-0f5cf9c06fdd" Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.523747 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c" event={"ID":"cfb9049e-2275-4d96-9131-29bb4def714b","Type":"ContainerStarted","Data":"9b0ecaf6909ca6331360fb75345c43d13b302195f0c444596ca2f6d34eed0567"} Dec 02 20:16:53 crc kubenswrapper[4807]: I1202 20:16:53.981216 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert\") pod \"infra-operator-controller-manager-57548d458d-dn7v2\" (UID: \"d7712aec-0995-489a-8cee-7e68fbf130df\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.981423 4807 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 20:16:53 crc kubenswrapper[4807]: E1202 20:16:53.982128 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert podName:d7712aec-0995-489a-8cee-7e68fbf130df nodeName:}" failed. No retries permitted until 2025-12-02 20:16:57.982106364 +0000 UTC m=+1153.283013859 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert") pod "infra-operator-controller-manager-57548d458d-dn7v2" (UID: "d7712aec-0995-489a-8cee-7e68fbf130df") : secret "infra-operator-webhook-server-cert" not found Dec 02 20:16:54 crc kubenswrapper[4807]: E1202 20:16:54.538034 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559" podUID="6511dc8d-00b4-4937-a330-0f5cf9c06fdd" Dec 02 20:16:54 crc kubenswrapper[4807]: E1202 20:16:54.538042 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" podUID="8dd00692-0728-47cf-b8fa-ab812b11ec8f" Dec 02 20:16:54 crc kubenswrapper[4807]: E1202 20:16:54.538351 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" podUID="733f7038-d2b9-4047-8ee9-3ad49a55729d" Dec 02 20:16:54 crc kubenswrapper[4807]: E1202 20:16:54.538983 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/openstack-k8s-operators/watcher-operator:0e562967e0a192baf562500f49cef0abd8c6f6ec\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" podUID="2a376983-2f33-465c-9781-391b67941e21" Dec 02 20:16:54 crc kubenswrapper[4807]: E1202 20:16:54.549520 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8" podUID="9198e60c-4301-40b6-9d1b-3e91a2f10fa5" Dec 02 20:16:54 crc kubenswrapper[4807]: E1202 20:16:54.549414 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m" podUID="490b3442-f4b4-493d-824a-67e370ac26f9" Dec 02 20:16:54 crc kubenswrapper[4807]: I1202 20:16:54.595894 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk\" (UID: \"e0e35837-6389-4e86-b8c5-46105f1332cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:16:54 crc kubenswrapper[4807]: E1202 20:16:54.596428 4807 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 20:16:54 crc kubenswrapper[4807]: E1202 20:16:54.597047 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert podName:e0e35837-6389-4e86-b8c5-46105f1332cb 
nodeName:}" failed. No retries permitted until 2025-12-02 20:16:58.597002909 +0000 UTC m=+1153.897910404 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" (UID: "e0e35837-6389-4e86-b8c5-46105f1332cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 20:16:55 crc kubenswrapper[4807]: I1202 20:16:55.109205 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:55 crc kubenswrapper[4807]: I1202 20:16:55.109336 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:55 crc kubenswrapper[4807]: E1202 20:16:55.109466 4807 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 20:16:55 crc kubenswrapper[4807]: E1202 20:16:55.109576 4807 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 20:16:55 crc kubenswrapper[4807]: E1202 20:16:55.109592 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. 
No retries permitted until 2025-12-02 20:16:59.10956731 +0000 UTC m=+1154.410474975 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "metrics-server-cert" not found Dec 02 20:16:55 crc kubenswrapper[4807]: E1202 20:16:55.109677 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. No retries permitted until 2025-12-02 20:16:59.109631142 +0000 UTC m=+1154.410538637 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "webhook-server-cert" not found Dec 02 20:16:58 crc kubenswrapper[4807]: I1202 20:16:58.055392 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert\") pod \"infra-operator-controller-manager-57548d458d-dn7v2\" (UID: \"d7712aec-0995-489a-8cee-7e68fbf130df\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" Dec 02 20:16:58 crc kubenswrapper[4807]: E1202 20:16:58.055640 4807 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 20:16:58 crc kubenswrapper[4807]: E1202 20:16:58.055913 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert podName:d7712aec-0995-489a-8cee-7e68fbf130df nodeName:}" failed. 
No retries permitted until 2025-12-02 20:17:06.055888385 +0000 UTC m=+1161.356795880 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert") pod "infra-operator-controller-manager-57548d458d-dn7v2" (UID: "d7712aec-0995-489a-8cee-7e68fbf130df") : secret "infra-operator-webhook-server-cert" not found Dec 02 20:16:58 crc kubenswrapper[4807]: I1202 20:16:58.293003 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:16:58 crc kubenswrapper[4807]: I1202 20:16:58.293074 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:16:58 crc kubenswrapper[4807]: I1202 20:16:58.665525 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk\" (UID: \"e0e35837-6389-4e86-b8c5-46105f1332cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:16:58 crc kubenswrapper[4807]: E1202 20:16:58.665805 4807 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 20:16:58 crc kubenswrapper[4807]: E1202 20:16:58.665953 4807 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert podName:e0e35837-6389-4e86-b8c5-46105f1332cb nodeName:}" failed. No retries permitted until 2025-12-02 20:17:06.665931178 +0000 UTC m=+1161.966838673 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" (UID: "e0e35837-6389-4e86-b8c5-46105f1332cb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 20:16:59 crc kubenswrapper[4807]: I1202 20:16:59.172357 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:59 crc kubenswrapper[4807]: I1202 20:16:59.172462 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:16:59 crc kubenswrapper[4807]: E1202 20:16:59.172526 4807 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 20:16:59 crc kubenswrapper[4807]: E1202 20:16:59.172601 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. 
No retries permitted until 2025-12-02 20:17:07.172580335 +0000 UTC m=+1162.473487830 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "metrics-server-cert" not found Dec 02 20:16:59 crc kubenswrapper[4807]: E1202 20:16:59.172613 4807 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 20:16:59 crc kubenswrapper[4807]: E1202 20:16:59.172691 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. No retries permitted until 2025-12-02 20:17:07.172671448 +0000 UTC m=+1162.473578943 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "webhook-server-cert" not found Dec 02 20:17:05 crc kubenswrapper[4807]: E1202 20:17:05.956604 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 02 20:17:05 crc kubenswrapper[4807]: E1202 20:17:05.957741 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2mnmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-7pqn2_openstack-operators(b53c3dbf-2380-4dff-9b18-a2207efcce60): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:17:06 crc kubenswrapper[4807]: I1202 20:17:06.099997 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert\") pod \"infra-operator-controller-manager-57548d458d-dn7v2\" (UID: \"d7712aec-0995-489a-8cee-7e68fbf130df\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" Dec 02 20:17:06 crc kubenswrapper[4807]: I1202 20:17:06.106824 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7712aec-0995-489a-8cee-7e68fbf130df-cert\") pod \"infra-operator-controller-manager-57548d458d-dn7v2\" (UID: \"d7712aec-0995-489a-8cee-7e68fbf130df\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" Dec 02 20:17:06 crc kubenswrapper[4807]: I1202 20:17:06.387073 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" Dec 02 20:17:06 crc kubenswrapper[4807]: I1202 20:17:06.711426 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk\" (UID: \"e0e35837-6389-4e86-b8c5-46105f1332cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:17:06 crc kubenswrapper[4807]: I1202 20:17:06.715578 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0e35837-6389-4e86-b8c5-46105f1332cb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk\" (UID: \"e0e35837-6389-4e86-b8c5-46105f1332cb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:17:06 crc kubenswrapper[4807]: I1202 20:17:06.864897 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:17:07 crc kubenswrapper[4807]: I1202 20:17:07.220204 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:17:07 crc kubenswrapper[4807]: I1202 20:17:07.220333 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:17:07 crc kubenswrapper[4807]: E1202 20:17:07.220443 4807 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 20:17:07 crc kubenswrapper[4807]: E1202 20:17:07.220480 4807 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 20:17:07 crc kubenswrapper[4807]: E1202 20:17:07.220545 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. No retries permitted until 2025-12-02 20:17:23.220525774 +0000 UTC m=+1178.521433269 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "metrics-server-cert" not found Dec 02 20:17:07 crc kubenswrapper[4807]: E1202 20:17:07.220565 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs podName:4039c119-ee84-4043-8892-733499aabdc5 nodeName:}" failed. No retries permitted until 2025-12-02 20:17:23.220556314 +0000 UTC m=+1178.521463809 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs") pod "openstack-operator-controller-manager-57fb6dd487-lffvh" (UID: "4039c119-ee84-4043-8892-733499aabdc5") : secret "webhook-server-cert" not found Dec 02 20:17:07 crc kubenswrapper[4807]: E1202 20:17:07.877888 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 02 20:17:07 crc kubenswrapper[4807]: E1202 20:17:07.878566 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wn5lr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-4qmqw_openstack-operators(959355af-f6bd-492c-af58-9a7378224225): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:17:21 crc kubenswrapper[4807]: E1202 20:17:21.725316 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 02 20:17:21 crc kubenswrapper[4807]: E1202 20:17:21.726561 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7mxml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-t8rj2_openstack-operators(3035086c-2661-4720-97b1-df4d0cd891a6): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:17:21 crc kubenswrapper[4807]: E1202 20:17:21.728065 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2" podUID="3035086c-2661-4720-97b1-df4d0cd891a6" Dec 02 20:17:21 crc kubenswrapper[4807]: E1202 20:17:21.798329 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2" podUID="3035086c-2661-4720-97b1-df4d0cd891a6" Dec 02 20:17:22 crc kubenswrapper[4807]: E1202 20:17:22.018734 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:7d6ca59745ac48971cbc2d72b53fe413144fa5c0c21f2ef1d7aaf1291851e501: Get \"https://quay.io/v2/openstack-k8s-operators/nova-operator/blobs/sha256:7d6ca59745ac48971cbc2d72b53fe413144fa5c0c21f2ef1d7aaf1291851e501\": context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 02 20:17:22 crc kubenswrapper[4807]: E1202 20:17:22.019516 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vwbp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-g8559_openstack-operators(6511dc8d-00b4-4937-a330-0f5cf9c06fdd): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:7d6ca59745ac48971cbc2d72b53fe413144fa5c0c21f2ef1d7aaf1291851e501: Get \"https://quay.io/v2/openstack-k8s-operators/nova-operator/blobs/sha256:7d6ca59745ac48971cbc2d72b53fe413144fa5c0c21f2ef1d7aaf1291851e501\": context canceled" logger="UnhandledError" Dec 02 20:17:22 crc kubenswrapper[4807]: I1202 20:17:22.307459 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2"] Dec 02 20:17:22 crc kubenswrapper[4807]: I1202 20:17:22.392853 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk"] Dec 02 20:17:22 crc kubenswrapper[4807]: I1202 20:17:22.807959 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h" event={"ID":"357ab26f-5ac4-46a1-b8f3-89db969b4082","Type":"ContainerStarted","Data":"c6da1007797668cf26ac136bc364fae514c5bb916a46e0e6613844c34e77133c"} Dec 02 
20:17:23 crc kubenswrapper[4807]: I1202 20:17:23.307418 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:17:23 crc kubenswrapper[4807]: I1202 20:17:23.309256 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:17:23 crc kubenswrapper[4807]: I1202 20:17:23.315592 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-metrics-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:17:23 crc kubenswrapper[4807]: I1202 20:17:23.324621 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4039c119-ee84-4043-8892-733499aabdc5-webhook-certs\") pod \"openstack-operator-controller-manager-57fb6dd487-lffvh\" (UID: \"4039c119-ee84-4043-8892-733499aabdc5\") " pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:17:23 crc kubenswrapper[4807]: I1202 20:17:23.348237 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:17:23 crc kubenswrapper[4807]: I1202 20:17:23.819955 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj" event={"ID":"7cccb577-4849-4e1c-b38e-669f7658eb2e","Type":"ContainerStarted","Data":"847765ce3064f8dc62645652fe5ddecc570c85a7cd73c52f118bb8d194278216"} Dec 02 20:17:23 crc kubenswrapper[4807]: I1202 20:17:23.822144 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" event={"ID":"d7712aec-0995-489a-8cee-7e68fbf130df","Type":"ContainerStarted","Data":"0aa8c3afb7b8f3a3e81e1d11081fd40376e7d1a91715b8d26cd6d85b9dca96df"} Dec 02 20:17:23 crc kubenswrapper[4807]: I1202 20:17:23.823324 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" event={"ID":"e0e35837-6389-4e86-b8c5-46105f1332cb","Type":"ContainerStarted","Data":"87f1148622a03ada9d6576bb15eac9b3aeefde1f1dc57b5bd0550059e78d928d"} Dec 02 20:17:24 crc kubenswrapper[4807]: I1202 20:17:24.866701 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v" event={"ID":"5ceaf50b-92b7-4069-b9b8-660e90c55d97","Type":"ContainerStarted","Data":"1bfa529427a38e6989bfb3b8b8d3159f8242c6a2c69a01f3ee742dcf140c94a7"} Dec 02 20:17:24 crc kubenswrapper[4807]: I1202 20:17:24.868092 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9g547" event={"ID":"1f4141f5-ba14-4c49-b114-07e5d506b255","Type":"ContainerStarted","Data":"0692a133b3e2a2348d9e01efde41269c0dedec65ac0a1b235377a8aa155de774"} Dec 02 20:17:24 crc kubenswrapper[4807]: I1202 20:17:24.869500 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2" event={"ID":"f376fe60-0cdf-4b30-ab61-80178d738ea4","Type":"ContainerStarted","Data":"4c0cf62f9c7c3d59ec4cfc9af08c1889f31c877f4b84735eef3a9ef8f1ba1bc0"} Dec 02 20:17:25 crc kubenswrapper[4807]: I1202 20:17:25.876960 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c" event={"ID":"cfb9049e-2275-4d96-9131-29bb4def714b","Type":"ContainerStarted","Data":"1de56359c34697e588e73358aae8a65a6cd40bb34b34ff6016867cd44b7b0d1e"} Dec 02 20:17:25 crc kubenswrapper[4807]: I1202 20:17:25.879641 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc" event={"ID":"fa1a5516-5c0d-4d4e-b052-d9301371a2d3","Type":"ContainerStarted","Data":"55e60f98f6d60d93b154c9876703df0184eb4be72ba5f66188c0e3e6abb79335"} Dec 02 20:17:28 crc kubenswrapper[4807]: I1202 20:17:28.293233 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:17:28 crc kubenswrapper[4807]: I1202 20:17:28.293939 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:17:28 crc kubenswrapper[4807]: I1202 20:17:28.294002 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:17:28 crc kubenswrapper[4807]: I1202 20:17:28.295969 4807 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38f5416dfc921b3e8d35befa1ba790fe16c356edf6ecbb6687773608446c2497"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:17:28 crc kubenswrapper[4807]: I1202 20:17:28.296059 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://38f5416dfc921b3e8d35befa1ba790fe16c356edf6ecbb6687773608446c2497" gracePeriod=600 Dec 02 20:17:28 crc kubenswrapper[4807]: I1202 20:17:28.916708 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg" event={"ID":"b2ef8498-337b-40c6-b122-19863c876321","Type":"ContainerStarted","Data":"58e0434facdb28b427242c0ad7e47aaa7a082108677189162d45704855456ed1"} Dec 02 20:17:28 crc kubenswrapper[4807]: I1202 20:17:28.918478 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf" event={"ID":"7c007dd6-7efa-47c1-af56-ce0bf8fd6f37","Type":"ContainerStarted","Data":"ad5d03098964307b14d6cc770df33d4e5ed7991f9561ccdf65b0a1a319703e93"} Dec 02 20:17:28 crc kubenswrapper[4807]: I1202 20:17:28.921356 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="38f5416dfc921b3e8d35befa1ba790fe16c356edf6ecbb6687773608446c2497" exitCode=0 Dec 02 20:17:28 crc kubenswrapper[4807]: I1202 20:17:28.921412 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" 
event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"38f5416dfc921b3e8d35befa1ba790fe16c356edf6ecbb6687773608446c2497"} Dec 02 20:17:28 crc kubenswrapper[4807]: I1202 20:17:28.921496 4807 scope.go:117] "RemoveContainer" containerID="e7e961064ac2bf5a3d6cced9f51594218b6b12f9bc4f97979713bb99fd8aad69" Dec 02 20:17:29 crc kubenswrapper[4807]: E1202 20:17:29.615646 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 20:17:29 crc kubenswrapper[4807]: E1202 20:17:29.616975 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2mnmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-7pqn2_openstack-operators(b53c3dbf-2380-4dff-9b18-a2207efcce60): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 02 20:17:29 crc kubenswrapper[4807]: E1202 20:17:29.618195 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2" podUID="b53c3dbf-2380-4dff-9b18-a2207efcce60" Dec 02 20:17:31 crc kubenswrapper[4807]: I1202 20:17:31.947199 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7" event={"ID":"3982be8e-b5d2-4795-9312-f3ba8466209c","Type":"ContainerStarted","Data":"d7240a61c921685dc11a74766b76b84ad1bc552245906c0764a9d29e3809f175"} Dec 02 20:17:31 crc 
kubenswrapper[4807]: I1202 20:17:31.951499 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246" event={"ID":"3c3d38aa-3600-41f6-97b3-e3699796526e","Type":"ContainerStarted","Data":"080f5009ea030a58dfe6169f7abdc4f5d46a2d339b552f00602d87db2104ec7a"} Dec 02 20:17:32 crc kubenswrapper[4807]: I1202 20:17:32.030514 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh"] Dec 02 20:17:33 crc kubenswrapper[4807]: I1202 20:17:33.025037 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" event={"ID":"4039c119-ee84-4043-8892-733499aabdc5","Type":"ContainerStarted","Data":"ff615e4ad373e3194e3557f1dd46650ee15f43656eed2718dedba4739cfd21ff"} Dec 02 20:17:33 crc kubenswrapper[4807]: I1202 20:17:33.031450 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m" event={"ID":"490b3442-f4b4-493d-824a-67e370ac26f9","Type":"ContainerStarted","Data":"0ec690db98954e26f7e37e7e108508ae35363927ecf68ece20cf80394e650d27"} Dec 02 20:17:33 crc kubenswrapper[4807]: I1202 20:17:33.036414 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"13f662bee9488997accc2766f7577233b423e1195e39b433e12fc85d986a041b"} Dec 02 20:17:34 crc kubenswrapper[4807]: I1202 20:17:34.056310 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" event={"ID":"733f7038-d2b9-4047-8ee9-3ad49a55729d","Type":"ContainerStarted","Data":"e0a6d548754a43b8b47a36fd795976ef2eaef006dd2ebb26d409bc8a4872f85f"} Dec 02 20:17:34 crc kubenswrapper[4807]: I1202 20:17:34.058645 4807 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" event={"ID":"8dd00692-0728-47cf-b8fa-ab812b11ec8f","Type":"ContainerStarted","Data":"b0da514a9ad5541c09d5949aaa787d977c34390651d82da387b8bc5b3e74b09f"} Dec 02 20:17:34 crc kubenswrapper[4807]: I1202 20:17:34.061414 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" event={"ID":"4039c119-ee84-4043-8892-733499aabdc5","Type":"ContainerStarted","Data":"02b24b26cf9ec8f74bd94896bceba12fef4f971496cf45f5aaa795110aedd4d4"} Dec 02 20:17:34 crc kubenswrapper[4807]: I1202 20:17:34.061564 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:17:34 crc kubenswrapper[4807]: I1202 20:17:34.070063 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8" event={"ID":"9198e60c-4301-40b6-9d1b-3e91a2f10fa5","Type":"ContainerStarted","Data":"187183d9c48d30da032c088ab6c5b5d8f4804567ba03dd2827e31326369a4651"} Dec 02 20:17:34 crc kubenswrapper[4807]: I1202 20:17:34.096220 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" podStartSLOduration=44.096188871 podStartE2EDuration="44.096188871s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:17:34.087018162 +0000 UTC m=+1189.387925657" watchObservedRunningTime="2025-12-02 20:17:34.096188871 +0000 UTC m=+1189.397096366" Dec 02 20:17:35 crc kubenswrapper[4807]: I1202 20:17:35.105755 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" 
event={"ID":"d7712aec-0995-489a-8cee-7e68fbf130df","Type":"ContainerStarted","Data":"cb1eac55ce5106fb2e22f81767a2652f69266d60fa9dbe15150a7a53dfc272f5"} Dec 02 20:17:36 crc kubenswrapper[4807]: I1202 20:17:36.120587 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" event={"ID":"2a376983-2f33-465c-9781-391b67941e21","Type":"ContainerStarted","Data":"b3d1dbf1128e92c251d590f45c75c899b3a8ffeb44ca61e284a9ede8746a7c08"} Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.163036 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" event={"ID":"e0e35837-6389-4e86-b8c5-46105f1332cb","Type":"ContainerStarted","Data":"053b94f9315e42f8f4d17615c5cdb189bbea36846622dd58d05a72b892539ae5"} Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.164604 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" event={"ID":"e0e35837-6389-4e86-b8c5-46105f1332cb","Type":"ContainerStarted","Data":"e8108fc248cad4f9d4fdd0e6ecb1602abcc953d473e9155f8a359b96ce12120d"} Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.165849 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.168004 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m" event={"ID":"490b3442-f4b4-493d-824a-67e370ac26f9","Type":"ContainerStarted","Data":"e26b2bcda9362a8cc62278c57243d8b349a1f14bc44d963ead5b9b785b919fe3"} Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.168930 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.171825 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.219095 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2" event={"ID":"b53c3dbf-2380-4dff-9b18-a2207efcce60","Type":"ContainerStarted","Data":"f2914e4f692dcd5ab98f4189eea47f3fa01553fd64de2a167c71bb943983a8d2"} Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.219175 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2" event={"ID":"b53c3dbf-2380-4dff-9b18-a2207efcce60","Type":"ContainerStarted","Data":"c35a77ea0a6ccf577238a4b2d9b61ecd0d22c2e45cac932bd644e9feb57fcf06"} Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.219448 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.221199 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" podStartSLOduration=37.9671026 podStartE2EDuration="47.221164292s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:17:23.049573683 +0000 UTC m=+1178.350481178" lastFinishedPulling="2025-12-02 20:17:32.303635375 +0000 UTC m=+1187.604542870" observedRunningTime="2025-12-02 20:17:37.218465433 +0000 UTC m=+1192.519372928" watchObservedRunningTime="2025-12-02 20:17:37.221164292 +0000 UTC m=+1192.522071787" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.222075 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7" event={"ID":"3982be8e-b5d2-4795-9312-f3ba8466209c","Type":"ContainerStarted","Data":"e3d3402db84cf13ef6d9de127024a44554eab0f223627735cdecd126472ec31a"} Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.222326 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.225390 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf" event={"ID":"7c007dd6-7efa-47c1-af56-ce0bf8fd6f37","Type":"ContainerStarted","Data":"2045422c6e4bc77938dc2346b53a597e7d57f4e50c106d3e4171d9133dca2f55"} Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.225564 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.227147 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8" event={"ID":"9198e60c-4301-40b6-9d1b-3e91a2f10fa5","Type":"ContainerStarted","Data":"3817f12be4f9b6a6226291928c07c23ad0c0162edbdcdec37eaba7e2621869c1"} Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.227645 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.228026 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.232071 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf" Dec 02 
20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.233041 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2" event={"ID":"f376fe60-0cdf-4b30-ab61-80178d738ea4","Type":"ContainerStarted","Data":"1e073ee395acd63498ac5f3d0f57a1dec5bc4f62ab73024592c347a4edbc8359"} Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.233843 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.235421 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.254983 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-52v8m" podStartSLOduration=4.333525233 podStartE2EDuration="47.254959394s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.091044109 +0000 UTC m=+1148.391951604" lastFinishedPulling="2025-12-02 20:17:36.01247827 +0000 UTC m=+1191.313385765" observedRunningTime="2025-12-02 20:17:37.244771355 +0000 UTC m=+1192.545678850" watchObservedRunningTime="2025-12-02 20:17:37.254959394 +0000 UTC m=+1192.555866889" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.288614 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-6mvz2" podStartSLOduration=4.17642223 podStartE2EDuration="47.288591662s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.029372538 +0000 UTC m=+1148.330280023" lastFinishedPulling="2025-12-02 20:17:36.14154195 +0000 UTC m=+1191.442449455" observedRunningTime="2025-12-02 20:17:37.268372798 +0000 UTC 
m=+1192.569280293" watchObservedRunningTime="2025-12-02 20:17:37.288591662 +0000 UTC m=+1192.589499157" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.306589 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8" podStartSLOduration=4.34432568 podStartE2EDuration="47.30656584s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.08598075 +0000 UTC m=+1148.386888245" lastFinishedPulling="2025-12-02 20:17:36.0482209 +0000 UTC m=+1191.349128405" observedRunningTime="2025-12-02 20:17:37.30383956 +0000 UTC m=+1192.604747065" watchObservedRunningTime="2025-12-02 20:17:37.30656584 +0000 UTC m=+1192.607473335" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.332045 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-r9jtf" podStartSLOduration=3.796270233 podStartE2EDuration="48.332014377s" podCreationTimestamp="2025-12-02 20:16:49 +0000 UTC" firstStartedPulling="2025-12-02 20:16:51.69820569 +0000 UTC m=+1146.999113185" lastFinishedPulling="2025-12-02 20:17:36.233949834 +0000 UTC m=+1191.534857329" observedRunningTime="2025-12-02 20:17:37.328053061 +0000 UTC m=+1192.628960556" watchObservedRunningTime="2025-12-02 20:17:37.332014377 +0000 UTC m=+1192.632921872" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.393892 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2" podStartSLOduration=7.615984897 podStartE2EDuration="47.393866183s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.046563332 +0000 UTC m=+1148.347470827" lastFinishedPulling="2025-12-02 20:17:32.824444618 +0000 UTC m=+1188.125352113" observedRunningTime="2025-12-02 20:17:37.366358495 +0000 UTC m=+1192.667266000" 
watchObservedRunningTime="2025-12-02 20:17:37.393866183 +0000 UTC m=+1192.694773678" Dec 02 20:17:37 crc kubenswrapper[4807]: I1202 20:17:37.404769 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g2lf7" podStartSLOduration=3.847779017 podStartE2EDuration="48.404696741s" podCreationTimestamp="2025-12-02 20:16:49 +0000 UTC" firstStartedPulling="2025-12-02 20:16:51.677848233 +0000 UTC m=+1146.978755728" lastFinishedPulling="2025-12-02 20:17:36.234765957 +0000 UTC m=+1191.535673452" observedRunningTime="2025-12-02 20:17:37.393298547 +0000 UTC m=+1192.694206042" watchObservedRunningTime="2025-12-02 20:17:37.404696741 +0000 UTC m=+1192.705604236" Dec 02 20:17:37 crc kubenswrapper[4807]: E1202 20:17:37.881303 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw" podUID="959355af-f6bd-492c-af58-9a7378224225" Dec 02 20:17:37 crc kubenswrapper[4807]: E1202 20:17:37.916352 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:7d6ca59745ac48971cbc2d72b53fe413144fa5c0c21f2ef1d7aaf1291851e501: Get \\\"https://quay.io/v2/openstack-k8s-operators/nova-operator/blobs/sha256:7d6ca59745ac48971cbc2d72b53fe413144fa5c0c21f2ef1d7aaf1291851e501\\\": context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559" podUID="6511dc8d-00b4-4937-a330-0f5cf9c06fdd" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.269255 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" 
event={"ID":"733f7038-d2b9-4047-8ee9-3ad49a55729d","Type":"ContainerStarted","Data":"9478f77b1045fb47033104f1a39a7e766c288a5e1e438eef6e2d79c146736f66"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.270011 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.275621 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" event={"ID":"8dd00692-0728-47cf-b8fa-ab812b11ec8f","Type":"ContainerStarted","Data":"7a47ea6c41b31bcec1f1f28ae5481218c5eb08342902057c0db37d04d674438e"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.276824 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.277735 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.282284 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg" event={"ID":"b2ef8498-337b-40c6-b122-19863c876321","Type":"ContainerStarted","Data":"c961af45ea242feee5c182e00349bafd6d76ec68ce366e693ae9202691979153"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.283254 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.284450 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.289028 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.299107 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj" event={"ID":"7cccb577-4849-4e1c-b38e-669f7658eb2e","Type":"ContainerStarted","Data":"ee7cc66b0c0e8974e1643185278ea9bce72877a8579df0f89a9e98672b640fc2"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.300107 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.319844 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.328006 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c" event={"ID":"cfb9049e-2275-4d96-9131-29bb4def714b","Type":"ContainerStarted","Data":"d84bbc8470505c0c25b218e20953dd68b8edde7ff19cd45c9de081961c38238f"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.329816 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.331916 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-qrthg" podStartSLOduration=5.005797432 podStartE2EDuration="48.331889307s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.082527068 +0000 UTC m=+1148.383434563" lastFinishedPulling="2025-12-02 20:17:36.408618943 +0000 UTC m=+1191.709526438" 
observedRunningTime="2025-12-02 20:17:38.319167843 +0000 UTC m=+1193.620075338" watchObservedRunningTime="2025-12-02 20:17:38.331889307 +0000 UTC m=+1193.632796802" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.341138 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.346784 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h" event={"ID":"357ab26f-5ac4-46a1-b8f3-89db969b4082","Type":"ContainerStarted","Data":"3b98a8e27a3cc4235bfe4530eeceabc08aba249306d75acc4b8288609bc08e68"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.348131 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.356388 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.359163 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nh4hg" podStartSLOduration=4.856409566 podStartE2EDuration="48.359144277s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.022238248 +0000 UTC m=+1148.323145743" lastFinishedPulling="2025-12-02 20:17:36.524972959 +0000 UTC m=+1191.825880454" observedRunningTime="2025-12-02 20:17:38.346776864 +0000 UTC m=+1193.647684359" watchObservedRunningTime="2025-12-02 20:17:38.359144277 +0000 UTC m=+1193.660051762" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.360180 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" event={"ID":"2a376983-2f33-465c-9781-391b67941e21","Type":"ContainerStarted","Data":"750e3a4b99989f237c3b78bc6187ea47668c91cec1410a4f205fa81f430057b0"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.360879 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.368661 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9g547" event={"ID":"1f4141f5-ba14-4c49-b114-07e5d506b255","Type":"ContainerStarted","Data":"fe58dba581650e9b0d54b6c458a1e9be2e9088f5346c14fea766099e4dbbc93d"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.369780 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9g547" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.371292 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc" event={"ID":"fa1a5516-5c0d-4d4e-b052-d9301371a2d3","Type":"ContainerStarted","Data":"e804e21138456ecf7355783900bced0a7f56f536560e83dcd708da937848b9a0"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.371835 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.377759 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.377808 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9g547" Dec 02 
20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.378581 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" event={"ID":"d7712aec-0995-489a-8cee-7e68fbf130df","Type":"ContainerStarted","Data":"3c5df33dbf9a0dc21aa6922211a96d660854c36022387b0c224d5e0fdce4a450"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.390638 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.433782 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246" event={"ID":"3c3d38aa-3600-41f6-97b3-e3699796526e","Type":"ContainerStarted","Data":"d62d7658649f536603c701aeb3199094dead37f5745bdaf0cb26a28c4900c354"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.434970 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.435846 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw" event={"ID":"959355af-f6bd-492c-af58-9a7378224225","Type":"ContainerStarted","Data":"92510272ddcc2af23ea09c6e8f9ab7da4d977d7f181af31624b59d76d92e5752"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.436526 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-tzhtf" podStartSLOduration=4.862449864 podStartE2EDuration="48.436510099s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.091259505 +0000 UTC m=+1148.392167000" lastFinishedPulling="2025-12-02 20:17:36.66531974 +0000 UTC m=+1191.966227235" observedRunningTime="2025-12-02 20:17:38.391120056 
+0000 UTC m=+1193.692027551" watchObservedRunningTime="2025-12-02 20:17:38.436510099 +0000 UTC m=+1193.737417594" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.477735 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559" event={"ID":"6511dc8d-00b4-4937-a330-0f5cf9c06fdd","Type":"ContainerStarted","Data":"432c0427bb29cebc8267e992e01cbf1d8862d242a3af8da8e803b1d75fc03974"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.482263 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-gd4bj" podStartSLOduration=4.753064772 podStartE2EDuration="48.482241392s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:52.973455396 +0000 UTC m=+1148.274362891" lastFinishedPulling="2025-12-02 20:17:36.702632016 +0000 UTC m=+1192.003539511" observedRunningTime="2025-12-02 20:17:38.433450219 +0000 UTC m=+1193.734357724" watchObservedRunningTime="2025-12-02 20:17:38.482241392 +0000 UTC m=+1193.783148887" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.487218 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246" Dec 02 20:17:38 crc kubenswrapper[4807]: E1202 20:17:38.495424 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559" podUID="6511dc8d-00b4-4937-a330-0f5cf9c06fdd" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.504286 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v" 
event={"ID":"5ceaf50b-92b7-4069-b9b8-660e90c55d97","Type":"ContainerStarted","Data":"37be6e09f96ca7d42b2a44899a04e7cea2ec747795507fcc5ab21f37c3b727bf"} Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.515053 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.531369 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.609193 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-tz65v" podStartSLOduration=4.697801289 podStartE2EDuration="48.609166229s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:52.85609805 +0000 UTC m=+1148.157005545" lastFinishedPulling="2025-12-02 20:17:36.76746299 +0000 UTC m=+1192.068370485" observedRunningTime="2025-12-02 20:17:38.580650442 +0000 UTC m=+1193.881557937" watchObservedRunningTime="2025-12-02 20:17:38.609166229 +0000 UTC m=+1193.910073724" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.621523 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-wzdmc" podStartSLOduration=5.385427671 podStartE2EDuration="48.621500921s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:52.998689827 +0000 UTC m=+1148.299597332" lastFinishedPulling="2025-12-02 20:17:36.234763067 +0000 UTC m=+1191.535670582" observedRunningTime="2025-12-02 20:17:38.616461063 +0000 UTC m=+1193.917368558" watchObservedRunningTime="2025-12-02 20:17:38.621500921 +0000 UTC m=+1193.922408416" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.735808 4807 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" podStartSLOduration=35.229853674 podStartE2EDuration="48.735766287s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:17:23.050078807 +0000 UTC m=+1178.350986302" lastFinishedPulling="2025-12-02 20:17:36.55599142 +0000 UTC m=+1191.856898915" observedRunningTime="2025-12-02 20:17:38.696530564 +0000 UTC m=+1193.997438059" watchObservedRunningTime="2025-12-02 20:17:38.735766287 +0000 UTC m=+1194.036673782" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.773331 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9g547" podStartSLOduration=5.48619064 podStartE2EDuration="48.773307959s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.037379253 +0000 UTC m=+1148.338286748" lastFinishedPulling="2025-12-02 20:17:36.324496572 +0000 UTC m=+1191.625404067" observedRunningTime="2025-12-02 20:17:38.739491886 +0000 UTC m=+1194.040399381" watchObservedRunningTime="2025-12-02 20:17:38.773307959 +0000 UTC m=+1194.074215454" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.806899 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" podStartSLOduration=9.591229388 podStartE2EDuration="48.806867364s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.09006769 +0000 UTC m=+1148.390975185" lastFinishedPulling="2025-12-02 20:17:32.305705676 +0000 UTC m=+1187.606613161" observedRunningTime="2025-12-02 20:17:38.798486398 +0000 UTC m=+1194.099393893" watchObservedRunningTime="2025-12-02 20:17:38.806867364 +0000 UTC m=+1194.107774849" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.820181 4807 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-mss4c" podStartSLOduration=6.023555115 podStartE2EDuration="49.820150734s" podCreationTimestamp="2025-12-02 20:16:49 +0000 UTC" firstStartedPulling="2025-12-02 20:16:52.438082066 +0000 UTC m=+1147.738989561" lastFinishedPulling="2025-12-02 20:17:36.234677685 +0000 UTC m=+1191.535585180" observedRunningTime="2025-12-02 20:17:38.819072943 +0000 UTC m=+1194.119980438" watchObservedRunningTime="2025-12-02 20:17:38.820150734 +0000 UTC m=+1194.121058229" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.841837 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-kq246" podStartSLOduration=6.205696743 podStartE2EDuration="49.84181611s" podCreationTimestamp="2025-12-02 20:16:49 +0000 UTC" firstStartedPulling="2025-12-02 20:16:52.918792721 +0000 UTC m=+1148.219700216" lastFinishedPulling="2025-12-02 20:17:36.554912088 +0000 UTC m=+1191.855819583" observedRunningTime="2025-12-02 20:17:38.839104901 +0000 UTC m=+1194.140012406" watchObservedRunningTime="2025-12-02 20:17:38.84181611 +0000 UTC m=+1194.142723605" Dec 02 20:17:38 crc kubenswrapper[4807]: I1202 20:17:38.886999 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-p2b2h" podStartSLOduration=6.846698716 podStartE2EDuration="49.886975777s" podCreationTimestamp="2025-12-02 20:16:49 +0000 UTC" firstStartedPulling="2025-12-02 20:16:52.972887859 +0000 UTC m=+1148.273795354" lastFinishedPulling="2025-12-02 20:17:36.01316492 +0000 UTC m=+1191.314072415" observedRunningTime="2025-12-02 20:17:38.884664409 +0000 UTC m=+1194.185571914" watchObservedRunningTime="2025-12-02 20:17:38.886975777 +0000 UTC m=+1194.187883272" Dec 02 20:17:39 crc kubenswrapper[4807]: I1202 20:17:39.513754 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2" event={"ID":"3035086c-2661-4720-97b1-df4d0cd891a6","Type":"ContainerStarted","Data":"4b04b348a17bfa50c008394a6bc76ea73683408af96a93538b092e8e52a28508"} Dec 02 20:17:39 crc kubenswrapper[4807]: I1202 20:17:39.518328 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw" event={"ID":"959355af-f6bd-492c-af58-9a7378224225","Type":"ContainerStarted","Data":"cad084500e0a290e90bf4464d36b5f78144c362b5bc35cd0ba99930cdbbafa57"} Dec 02 20:17:39 crc kubenswrapper[4807]: I1202 20:17:39.520879 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw" Dec 02 20:17:39 crc kubenswrapper[4807]: I1202 20:17:39.524941 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gmwq8" Dec 02 20:17:39 crc kubenswrapper[4807]: I1202 20:17:39.528551 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-dn7v2" Dec 02 20:17:39 crc kubenswrapper[4807]: I1202 20:17:39.539541 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-t8rj2" podStartSLOduration=4.456555734 podStartE2EDuration="49.539517496s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.068843897 +0000 UTC m=+1148.369751382" lastFinishedPulling="2025-12-02 20:17:38.151805649 +0000 UTC m=+1193.452713144" observedRunningTime="2025-12-02 20:17:39.536072465 +0000 UTC m=+1194.836979970" watchObservedRunningTime="2025-12-02 20:17:39.539517496 +0000 UTC m=+1194.840425001" Dec 02 20:17:39 crc kubenswrapper[4807]: I1202 20:17:39.592308 4807 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw" podStartSLOduration=3.586621399 podStartE2EDuration="49.592282346s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.068801015 +0000 UTC m=+1148.369708510" lastFinishedPulling="2025-12-02 20:17:39.074461972 +0000 UTC m=+1194.375369457" observedRunningTime="2025-12-02 20:17:39.590270406 +0000 UTC m=+1194.891177911" watchObservedRunningTime="2025-12-02 20:17:39.592282346 +0000 UTC m=+1194.893189871" Dec 02 20:17:41 crc kubenswrapper[4807]: I1202 20:17:41.229627 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-7pqn2" Dec 02 20:17:41 crc kubenswrapper[4807]: I1202 20:17:41.330553 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-58888ff59d-cwws5" Dec 02 20:17:43 crc kubenswrapper[4807]: I1202 20:17:43.358267 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-57fb6dd487-lffvh" Dec 02 20:17:46 crc kubenswrapper[4807]: I1202 20:17:46.874829 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk" Dec 02 20:17:51 crc kubenswrapper[4807]: I1202 20:17:51.246084 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-4qmqw" Dec 02 20:17:56 crc kubenswrapper[4807]: I1202 20:17:56.708567 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559" event={"ID":"6511dc8d-00b4-4937-a330-0f5cf9c06fdd","Type":"ContainerStarted","Data":"c6d5d88bec3628583c66997e806137261f7f76dd057e796da3102dbced2284a1"} Dec 02 20:17:56 crc 
kubenswrapper[4807]: I1202 20:17:56.709693 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559" Dec 02 20:17:56 crc kubenswrapper[4807]: I1202 20:17:56.735979 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559" podStartSLOduration=4.031631118 podStartE2EDuration="1m6.735953478s" podCreationTimestamp="2025-12-02 20:16:50 +0000 UTC" firstStartedPulling="2025-12-02 20:16:53.086232297 +0000 UTC m=+1148.387139792" lastFinishedPulling="2025-12-02 20:17:55.790554617 +0000 UTC m=+1211.091462152" observedRunningTime="2025-12-02 20:17:56.730171358 +0000 UTC m=+1212.031078863" watchObservedRunningTime="2025-12-02 20:17:56.735953478 +0000 UTC m=+1212.036860973" Dec 02 20:18:00 crc kubenswrapper[4807]: I1202 20:18:00.787210 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-g8559" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.346697 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n7vn5"] Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.354662 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.361548 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.361842 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.361998 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.362142 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-tjqvm" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.370441 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n7vn5"] Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.428868 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-config\") pod \"dnsmasq-dns-675f4bcbfc-n7vn5\" (UID: \"97ab33d4-57ad-4b4e-80b7-b93c947b8f41\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.428945 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5xk4\" (UniqueName: \"kubernetes.io/projected/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-kube-api-access-h5xk4\") pod \"dnsmasq-dns-675f4bcbfc-n7vn5\" (UID: \"97ab33d4-57ad-4b4e-80b7-b93c947b8f41\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.443619 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qzrxh"] Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.446961 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.451021 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qzrxh"] Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.451072 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.531058 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qzrxh\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.531591 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-config\") pod \"dnsmasq-dns-78dd6ddcc-qzrxh\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.531640 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xts95\" (UniqueName: \"kubernetes.io/projected/3496728f-063f-4e10-85d4-003a3b64bbe7-kube-api-access-xts95\") pod \"dnsmasq-dns-78dd6ddcc-qzrxh\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.531684 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-config\") pod \"dnsmasq-dns-675f4bcbfc-n7vn5\" (UID: \"97ab33d4-57ad-4b4e-80b7-b93c947b8f41\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" Dec 02 20:18:20 crc 
kubenswrapper[4807]: I1202 20:18:20.531708 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5xk4\" (UniqueName: \"kubernetes.io/projected/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-kube-api-access-h5xk4\") pod \"dnsmasq-dns-675f4bcbfc-n7vn5\" (UID: \"97ab33d4-57ad-4b4e-80b7-b93c947b8f41\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.532555 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-config\") pod \"dnsmasq-dns-675f4bcbfc-n7vn5\" (UID: \"97ab33d4-57ad-4b4e-80b7-b93c947b8f41\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.551831 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5xk4\" (UniqueName: \"kubernetes.io/projected/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-kube-api-access-h5xk4\") pod \"dnsmasq-dns-675f4bcbfc-n7vn5\" (UID: \"97ab33d4-57ad-4b4e-80b7-b93c947b8f41\") " pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.633539 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qzrxh\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.633590 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-config\") pod \"dnsmasq-dns-78dd6ddcc-qzrxh\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.633628 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xts95\" (UniqueName: \"kubernetes.io/projected/3496728f-063f-4e10-85d4-003a3b64bbe7-kube-api-access-xts95\") pod \"dnsmasq-dns-78dd6ddcc-qzrxh\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.634563 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qzrxh\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.634987 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-config\") pod \"dnsmasq-dns-78dd6ddcc-qzrxh\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.656059 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xts95\" (UniqueName: \"kubernetes.io/projected/3496728f-063f-4e10-85d4-003a3b64bbe7-kube-api-access-xts95\") pod \"dnsmasq-dns-78dd6ddcc-qzrxh\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.703210 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" Dec 02 20:18:20 crc kubenswrapper[4807]: I1202 20:18:20.764138 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:21 crc kubenswrapper[4807]: I1202 20:18:21.200572 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n7vn5"] Dec 02 20:18:21 crc kubenswrapper[4807]: I1202 20:18:21.291071 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qzrxh"] Dec 02 20:18:21 crc kubenswrapper[4807]: W1202 20:18:21.293117 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3496728f_063f_4e10_85d4_003a3b64bbe7.slice/crio-0e6d5cdf3bb3094599b0aaae623b06493cb5f25c9b30e13ab4fb996e4982c1e3 WatchSource:0}: Error finding container 0e6d5cdf3bb3094599b0aaae623b06493cb5f25c9b30e13ab4fb996e4982c1e3: Status 404 returned error can't find the container with id 0e6d5cdf3bb3094599b0aaae623b06493cb5f25c9b30e13ab4fb996e4982c1e3 Dec 02 20:18:21 crc kubenswrapper[4807]: I1202 20:18:21.944644 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" event={"ID":"97ab33d4-57ad-4b4e-80b7-b93c947b8f41","Type":"ContainerStarted","Data":"4df17332ead7406bdf0163dfd744f576932c741b27ace1dea5056b061523d317"} Dec 02 20:18:21 crc kubenswrapper[4807]: I1202 20:18:21.946623 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" event={"ID":"3496728f-063f-4e10-85d4-003a3b64bbe7","Type":"ContainerStarted","Data":"0e6d5cdf3bb3094599b0aaae623b06493cb5f25c9b30e13ab4fb996e4982c1e3"} Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.310000 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n7vn5"] Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.335506 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nnbmr"] Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.343524 4807 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.379218 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nnbmr"] Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.490455 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nnbmr\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.490892 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdtl\" (UniqueName: \"kubernetes.io/projected/1c99b3b4-e736-49f9-a876-7c72a2cc021c-kube-api-access-4tdtl\") pod \"dnsmasq-dns-666b6646f7-nnbmr\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.490943 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-config\") pod \"dnsmasq-dns-666b6646f7-nnbmr\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.592027 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nnbmr\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.592090 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdtl\" 
(UniqueName: \"kubernetes.io/projected/1c99b3b4-e736-49f9-a876-7c72a2cc021c-kube-api-access-4tdtl\") pod \"dnsmasq-dns-666b6646f7-nnbmr\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.592131 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-config\") pod \"dnsmasq-dns-666b6646f7-nnbmr\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.593095 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-config\") pod \"dnsmasq-dns-666b6646f7-nnbmr\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.593660 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nnbmr\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.644399 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdtl\" (UniqueName: \"kubernetes.io/projected/1c99b3b4-e736-49f9-a876-7c72a2cc021c-kube-api-access-4tdtl\") pod \"dnsmasq-dns-666b6646f7-nnbmr\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.677278 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.739085 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qzrxh"] Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.764084 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cpgvr"] Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.765491 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.798614 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cpgvr"] Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.903003 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-config\") pod \"dnsmasq-dns-57d769cc4f-cpgvr\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.903379 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cpgvr\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:18:23 crc kubenswrapper[4807]: I1202 20:18:23.903414 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhmgq\" (UniqueName: \"kubernetes.io/projected/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-kube-api-access-qhmgq\") pod \"dnsmasq-dns-57d769cc4f-cpgvr\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 
20:18:24.013797 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-config\") pod \"dnsmasq-dns-57d769cc4f-cpgvr\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.013849 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cpgvr\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.013885 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhmgq\" (UniqueName: \"kubernetes.io/projected/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-kube-api-access-qhmgq\") pod \"dnsmasq-dns-57d769cc4f-cpgvr\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.015109 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-cpgvr\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.015608 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-config\") pod \"dnsmasq-dns-57d769cc4f-cpgvr\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.046648 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhmgq\" 
(UniqueName: \"kubernetes.io/projected/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-kube-api-access-qhmgq\") pod \"dnsmasq-dns-57d769cc4f-cpgvr\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.146802 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.249123 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nnbmr"] Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.561914 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.563945 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.568344 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.568416 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-p86rf" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.568344 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.568969 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.569076 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.569198 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.571503 4807 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.595020 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.621072 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cpgvr"] Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.734671 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef1b0038-6b67-47e8-92e8-06efe88df856-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.734785 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.734819 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.734843 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vczf7\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-kube-api-access-vczf7\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.734875 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.734896 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-config-data\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.734920 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.734937 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef1b0038-6b67-47e8-92e8-06efe88df856-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.734963 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.734988 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.735017 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.840071 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.840154 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.840192 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vczf7\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-kube-api-access-vczf7\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.840255 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.840326 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-config-data\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.840375 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.840395 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef1b0038-6b67-47e8-92e8-06efe88df856-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.840429 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.840454 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.840768 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.841183 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.840501 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.841523 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef1b0038-6b67-47e8-92e8-06efe88df856-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.841698 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.841834 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.841844 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.841896 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-config-data\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.847231 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.847250 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef1b0038-6b67-47e8-92e8-06efe88df856-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.848822 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef1b0038-6b67-47e8-92e8-06efe88df856-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.859924 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vczf7\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-kube-api-access-vczf7\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.893079 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.896949 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.900946 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nrxx2" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.901421 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.907267 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.908289 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.909706 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.910582 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.911375 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.926531 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.930161 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:24 crc kubenswrapper[4807]: I1202 20:18:24.960260 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " pod="openstack/rabbitmq-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.039243 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" event={"ID":"1c99b3b4-e736-49f9-a876-7c72a2cc021c","Type":"ContainerStarted","Data":"a28fbf5e1a34b7f6e9cc7173865cdcd9c9b5c065c7098658726143c36da04c2c"} Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.039791 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" event={"ID":"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de","Type":"ContainerStarted","Data":"a7aa50202b82142f653d9e6602bda689faecaf032754478bb11d67ef4bceaa88"} Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.058612 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.060885 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.060986 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.061039 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.061102 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl7c9\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-kube-api-access-xl7c9\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.061200 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.061220 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.062309 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.063080 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.063112 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.063153 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.164618 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.164736 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.164768 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.164809 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.165333 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.165666 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.166576 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.167936 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl7c9\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-kube-api-access-xl7c9\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.169050 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.169112 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.170448 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-server-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.170597 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.170655 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.170822 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.170877 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.171567 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc 
kubenswrapper[4807]: I1202 20:18:25.171939 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.172192 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.172609 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.174485 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.175567 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.195687 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.201662 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.203002 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl7c9\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-kube-api-access-xl7c9\") pod \"rabbitmq-cell1-server-0\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.231698 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.945538 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 20:18:25 crc kubenswrapper[4807]: I1202 20:18:25.959044 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.092954 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.097823 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.100463 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.103405 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.103687 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-p6zvb" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.104557 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.110990 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.118018 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.290975 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfzjx\" (UniqueName: \"kubernetes.io/projected/df8d83f3-6675-416b-a039-2aafac45fe18-kube-api-access-bfzjx\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.291042 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.291065 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8d83f3-6675-416b-a039-2aafac45fe18-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.291085 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df8d83f3-6675-416b-a039-2aafac45fe18-operator-scripts\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.291122 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/df8d83f3-6675-416b-a039-2aafac45fe18-config-data-default\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.291141 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/df8d83f3-6675-416b-a039-2aafac45fe18-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.291226 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/df8d83f3-6675-416b-a039-2aafac45fe18-config-data-generated\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.291294 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/df8d83f3-6675-416b-a039-2aafac45fe18-kolla-config\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.423881 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/df8d83f3-6675-416b-a039-2aafac45fe18-kolla-config\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.423959 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfzjx\" (UniqueName: \"kubernetes.io/projected/df8d83f3-6675-416b-a039-2aafac45fe18-kube-api-access-bfzjx\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.423990 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.424011 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8d83f3-6675-416b-a039-2aafac45fe18-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.424032 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df8d83f3-6675-416b-a039-2aafac45fe18-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.424080 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/df8d83f3-6675-416b-a039-2aafac45fe18-config-data-default\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.424110 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/df8d83f3-6675-416b-a039-2aafac45fe18-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.424168 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/df8d83f3-6675-416b-a039-2aafac45fe18-config-data-generated\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.424660 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/df8d83f3-6675-416b-a039-2aafac45fe18-config-data-generated\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.425830 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/df8d83f3-6675-416b-a039-2aafac45fe18-config-data-default\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: 
I1202 20:18:26.426286 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df8d83f3-6675-416b-a039-2aafac45fe18-operator-scripts\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.426551 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.426961 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/df8d83f3-6675-416b-a039-2aafac45fe18-kolla-config\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.432801 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/df8d83f3-6675-416b-a039-2aafac45fe18-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.435686 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8d83f3-6675-416b-a039-2aafac45fe18-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.453352 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfzjx\" (UniqueName: 
\"kubernetes.io/projected/df8d83f3-6675-416b-a039-2aafac45fe18-kube-api-access-bfzjx\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.494957 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"df8d83f3-6675-416b-a039-2aafac45fe18\") " pod="openstack/openstack-galera-0" Dec 02 20:18:26 crc kubenswrapper[4807]: I1202 20:18:26.732515 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.380432 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.383448 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.390692 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-g8v5b" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.392492 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.393109 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.393454 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.393770 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.581231 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96ba9206-497c-4cd1-a16d-436d2ba285a7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.581284 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.581322 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjpff\" (UniqueName: \"kubernetes.io/projected/96ba9206-497c-4cd1-a16d-436d2ba285a7-kube-api-access-cjpff\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.581350 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ba9206-497c-4cd1-a16d-436d2ba285a7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.581378 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ba9206-497c-4cd1-a16d-436d2ba285a7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.581397 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/96ba9206-497c-4cd1-a16d-436d2ba285a7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.581418 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96ba9206-497c-4cd1-a16d-436d2ba285a7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.581445 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/96ba9206-497c-4cd1-a16d-436d2ba285a7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.658815 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.661337 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.672467 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.674629 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-7p8j4" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.676215 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.683329 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96ba9206-497c-4cd1-a16d-436d2ba285a7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.683405 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.683444 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjpff\" (UniqueName: \"kubernetes.io/projected/96ba9206-497c-4cd1-a16d-436d2ba285a7-kube-api-access-cjpff\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.683483 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ba9206-497c-4cd1-a16d-436d2ba285a7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.683508 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ba9206-497c-4cd1-a16d-436d2ba285a7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.683531 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/96ba9206-497c-4cd1-a16d-436d2ba285a7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.683560 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96ba9206-497c-4cd1-a16d-436d2ba285a7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.683596 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/96ba9206-497c-4cd1-a16d-436d2ba285a7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.684436 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") device mount path \"/mnt/openstack/pv11\"" 
pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.684650 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96ba9206-497c-4cd1-a16d-436d2ba285a7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.684656 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.684991 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/96ba9206-497c-4cd1-a16d-436d2ba285a7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.685531 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/96ba9206-497c-4cd1-a16d-436d2ba285a7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.687561 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96ba9206-497c-4cd1-a16d-436d2ba285a7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.689262 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ba9206-497c-4cd1-a16d-436d2ba285a7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.696155 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/96ba9206-497c-4cd1-a16d-436d2ba285a7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.705929 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjpff\" (UniqueName: \"kubernetes.io/projected/96ba9206-497c-4cd1-a16d-436d2ba285a7-kube-api-access-cjpff\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.734426 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"96ba9206-497c-4cd1-a16d-436d2ba285a7\") " pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.788790 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4225655b-c174-479e-a740-b768c9801287-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.788900 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4225655b-c174-479e-a740-b768c9801287-kolla-config\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 
20:18:27.788923 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4225655b-c174-479e-a740-b768c9801287-config-data\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.788945 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4225655b-c174-479e-a740-b768c9801287-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.788962 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb6c7\" (UniqueName: \"kubernetes.io/projected/4225655b-c174-479e-a740-b768c9801287-kube-api-access-hb6c7\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.890439 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4225655b-c174-479e-a740-b768c9801287-kolla-config\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.890548 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4225655b-c174-479e-a740-b768c9801287-config-data\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.890577 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4225655b-c174-479e-a740-b768c9801287-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.890600 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb6c7\" (UniqueName: \"kubernetes.io/projected/4225655b-c174-479e-a740-b768c9801287-kube-api-access-hb6c7\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.890687 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4225655b-c174-479e-a740-b768c9801287-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.891964 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4225655b-c174-479e-a740-b768c9801287-config-data\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.895972 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4225655b-c174-479e-a740-b768c9801287-kolla-config\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.917581 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/4225655b-c174-479e-a740-b768c9801287-memcached-tls-certs\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 
20:18:27.955040 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4225655b-c174-479e-a740-b768c9801287-combined-ca-bundle\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:27 crc kubenswrapper[4807]: I1202 20:18:27.980510 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb6c7\" (UniqueName: \"kubernetes.io/projected/4225655b-c174-479e-a740-b768c9801287-kube-api-access-hb6c7\") pod \"memcached-0\" (UID: \"4225655b-c174-479e-a740-b768c9801287\") " pod="openstack/memcached-0" Dec 02 20:18:28 crc kubenswrapper[4807]: I1202 20:18:28.026447 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 20:18:28 crc kubenswrapper[4807]: I1202 20:18:28.078240 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 20:18:29 crc kubenswrapper[4807]: I1202 20:18:29.665894 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 20:18:29 crc kubenswrapper[4807]: I1202 20:18:29.670741 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 20:18:29 crc kubenswrapper[4807]: I1202 20:18:29.687369 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wvbmg" Dec 02 20:18:29 crc kubenswrapper[4807]: I1202 20:18:29.688361 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 20:18:29 crc kubenswrapper[4807]: I1202 20:18:29.730684 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9bkw\" (UniqueName: \"kubernetes.io/projected/7e167e24-edba-4ff8-8b31-2c7141238bde-kube-api-access-d9bkw\") pod \"kube-state-metrics-0\" (UID: \"7e167e24-edba-4ff8-8b31-2c7141238bde\") " pod="openstack/kube-state-metrics-0" Dec 02 20:18:29 crc kubenswrapper[4807]: I1202 20:18:29.833009 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9bkw\" (UniqueName: \"kubernetes.io/projected/7e167e24-edba-4ff8-8b31-2c7141238bde-kube-api-access-d9bkw\") pod \"kube-state-metrics-0\" (UID: \"7e167e24-edba-4ff8-8b31-2c7141238bde\") " pod="openstack/kube-state-metrics-0" Dec 02 20:18:29 crc kubenswrapper[4807]: I1202 20:18:29.876176 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9bkw\" (UniqueName: \"kubernetes.io/projected/7e167e24-edba-4ff8-8b31-2c7141238bde-kube-api-access-d9bkw\") pod \"kube-state-metrics-0\" (UID: \"7e167e24-edba-4ff8-8b31-2c7141238bde\") " pod="openstack/kube-state-metrics-0" Dec 02 20:18:29 crc kubenswrapper[4807]: I1202 20:18:29.996966 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.059391 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.062455 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.065343 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.065849 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-nlqxt" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.066052 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.066086 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.070117 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.075613 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.077402 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.199537 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj5f5\" (UniqueName: \"kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-kube-api-access-cj5f5\") pod 
\"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.200014 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.200064 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.200128 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.200175 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.200205 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.200241 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89cb183e-cc2d-4fd9-90d8-212c434f0d06-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.200312 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.304163 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.304269 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.304326 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.304349 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.304413 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89cb183e-cc2d-4fd9-90d8-212c434f0d06-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.304484 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.304508 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj5f5\" (UniqueName: \"kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-kube-api-access-cj5f5\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.304567 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.306956 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89cb183e-cc2d-4fd9-90d8-212c434f0d06-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.309476 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.309933 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.310242 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.310371 4807 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.310407 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8dd81587a0a3f5d67a5d533af4320b55477e158168be00efde5dc29e79f819c0/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.316151 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.320933 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj5f5\" (UniqueName: \"kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-kube-api-access-cj5f5\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.329588 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.350570 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:31 crc kubenswrapper[4807]: I1202 20:18:31.391459 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.056935 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kstxp"] Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.060077 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.063407 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-5chhj" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.063452 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.063811 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.078686 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-pxxrz"] Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.081104 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.089379 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kstxp"] Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.129455 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pxxrz"] Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.158626 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-var-lib\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.158702 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-var-log\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.158746 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-scripts\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.158784 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-ovn-controller-tls-certs\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 
20:18:33.158810 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnvtz\" (UniqueName: \"kubernetes.io/projected/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-kube-api-access-bnvtz\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.158834 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-var-run-ovn\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.158856 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qcrb\" (UniqueName: \"kubernetes.io/projected/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-kube-api-access-4qcrb\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.158939 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-var-run\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.159001 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-var-run\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.159064 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-var-log-ovn\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.159098 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-etc-ovs\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.159322 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-scripts\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.159440 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-combined-ca-bundle\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.260883 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-var-run\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.260992 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-var-log-ovn\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.261025 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-etc-ovs\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.261556 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-etc-ovs\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.261647 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-scripts\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.261942 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-var-log-ovn\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.262035 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-var-run\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " 
pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.262456 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-combined-ca-bundle\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.262662 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-var-lib\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.262911 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-var-lib\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.263011 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-var-log\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.263098 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-scripts\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.263197 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-var-log\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.263335 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-ovn-controller-tls-certs\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.263462 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnvtz\" (UniqueName: \"kubernetes.io/projected/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-kube-api-access-bnvtz\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.263561 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-var-run-ovn\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.263627 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-scripts\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.263732 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qcrb\" (UniqueName: 
\"kubernetes.io/projected/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-kube-api-access-4qcrb\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.263872 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-var-run\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.264095 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-var-run\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.264260 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-var-run-ovn\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.270668 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-scripts\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.287458 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-ovn-controller-tls-certs\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " 
pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.287947 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-combined-ca-bundle\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.291377 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qcrb\" (UniqueName: \"kubernetes.io/projected/80844aa2-667c-4b2d-a55a-e5fa2cd3dd85-kube-api-access-4qcrb\") pod \"ovn-controller-kstxp\" (UID: \"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85\") " pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.292220 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnvtz\" (UniqueName: \"kubernetes.io/projected/f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd-kube-api-access-bnvtz\") pod \"ovn-controller-ovs-pxxrz\" (UID: \"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd\") " pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.384351 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kstxp" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.418012 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.497183 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.498583 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.501434 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.501954 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kjvq2" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.502158 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.502589 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.502970 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.517771 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.669548 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.669637 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a63890f-30c8-4538-903a-121488dba6bb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.669667 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a63890f-30c8-4538-903a-121488dba6bb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.669710 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a63890f-30c8-4538-903a-121488dba6bb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.669746 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a63890f-30c8-4538-903a-121488dba6bb-config\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.669763 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a63890f-30c8-4538-903a-121488dba6bb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.669779 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a63890f-30c8-4538-903a-121488dba6bb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.669818 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrwlx\" (UniqueName: 
\"kubernetes.io/projected/7a63890f-30c8-4538-903a-121488dba6bb-kube-api-access-zrwlx\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.770806 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrwlx\" (UniqueName: \"kubernetes.io/projected/7a63890f-30c8-4538-903a-121488dba6bb-kube-api-access-zrwlx\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.770864 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.770925 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a63890f-30c8-4538-903a-121488dba6bb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.770955 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a63890f-30c8-4538-903a-121488dba6bb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.771002 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a63890f-30c8-4538-903a-121488dba6bb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.771027 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a63890f-30c8-4538-903a-121488dba6bb-config\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.771042 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a63890f-30c8-4538-903a-121488dba6bb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.771057 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a63890f-30c8-4538-903a-121488dba6bb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.771918 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.772052 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a63890f-30c8-4538-903a-121488dba6bb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.773034 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a63890f-30c8-4538-903a-121488dba6bb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.773116 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a63890f-30c8-4538-903a-121488dba6bb-config\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.786059 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a63890f-30c8-4538-903a-121488dba6bb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.789688 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a63890f-30c8-4538-903a-121488dba6bb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.801518 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a63890f-30c8-4538-903a-121488dba6bb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.826063 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " 
pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:33 crc kubenswrapper[4807]: I1202 20:18:33.831051 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrwlx\" (UniqueName: \"kubernetes.io/projected/7a63890f-30c8-4538-903a-121488dba6bb-kube-api-access-zrwlx\") pod \"ovsdbserver-nb-0\" (UID: \"7a63890f-30c8-4538-903a-121488dba6bb\") " pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:34 crc kubenswrapper[4807]: I1202 20:18:34.117749 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.535511 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.538530 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.543121 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.543641 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2q2mm" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.543829 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.544385 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.550093 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.662108 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/47753529-368a-4c5c-a3f8-27ffd55e41d1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.662208 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47753529-368a-4c5c-a3f8-27ffd55e41d1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.662257 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.662295 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47753529-368a-4c5c-a3f8-27ffd55e41d1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.662360 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47753529-368a-4c5c-a3f8-27ffd55e41d1-config\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.662390 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47753529-368a-4c5c-a3f8-27ffd55e41d1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" 
(UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.662422 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9fjj\" (UniqueName: \"kubernetes.io/projected/47753529-368a-4c5c-a3f8-27ffd55e41d1-kube-api-access-r9fjj\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.662460 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/47753529-368a-4c5c-a3f8-27ffd55e41d1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.764705 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9fjj\" (UniqueName: \"kubernetes.io/projected/47753529-368a-4c5c-a3f8-27ffd55e41d1-kube-api-access-r9fjj\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.764803 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/47753529-368a-4c5c-a3f8-27ffd55e41d1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.764873 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47753529-368a-4c5c-a3f8-27ffd55e41d1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 
crc kubenswrapper[4807]: I1202 20:18:37.764907 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47753529-368a-4c5c-a3f8-27ffd55e41d1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.764941 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.764986 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47753529-368a-4c5c-a3f8-27ffd55e41d1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.765029 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47753529-368a-4c5c-a3f8-27ffd55e41d1-config\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.765051 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47753529-368a-4c5c-a3f8-27ffd55e41d1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.765773 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/47753529-368a-4c5c-a3f8-27ffd55e41d1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.765903 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.767127 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47753529-368a-4c5c-a3f8-27ffd55e41d1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.767754 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47753529-368a-4c5c-a3f8-27ffd55e41d1-config\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.775325 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47753529-368a-4c5c-a3f8-27ffd55e41d1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.775585 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/47753529-368a-4c5c-a3f8-27ffd55e41d1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 
20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.776206 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47753529-368a-4c5c-a3f8-27ffd55e41d1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.789467 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9fjj\" (UniqueName: \"kubernetes.io/projected/47753529-368a-4c5c-a3f8-27ffd55e41d1-kube-api-access-r9fjj\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.795205 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"47753529-368a-4c5c-a3f8-27ffd55e41d1\") " pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:37 crc kubenswrapper[4807]: I1202 20:18:37.870052 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 20:18:38 crc kubenswrapper[4807]: W1202 20:18:38.488655 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2af53e2b_bee9_4bd4_aab6_e58d33057ac7.slice/crio-017f9569ab6a8f1f88a7ec92980fd56f03bac66b21db08931cfc99192a24a5f9 WatchSource:0}: Error finding container 017f9569ab6a8f1f88a7ec92980fd56f03bac66b21db08931cfc99192a24a5f9: Status 404 returned error can't find the container with id 017f9569ab6a8f1f88a7ec92980fd56f03bac66b21db08931cfc99192a24a5f9 Dec 02 20:18:39 crc kubenswrapper[4807]: I1202 20:18:39.365325 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2af53e2b-bee9-4bd4-aab6-e58d33057ac7","Type":"ContainerStarted","Data":"017f9569ab6a8f1f88a7ec92980fd56f03bac66b21db08931cfc99192a24a5f9"} Dec 02 20:18:39 crc kubenswrapper[4807]: I1202 20:18:39.367341 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1b0038-6b67-47e8-92e8-06efe88df856","Type":"ContainerStarted","Data":"152c8403c19d89f86582ad83cbb2995d1d787feb835481ee548f97d80216db79"} Dec 02 20:18:42 crc kubenswrapper[4807]: I1202 20:18:42.984202 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pxxrz"] Dec 02 20:18:45 crc kubenswrapper[4807]: I1202 20:18:45.836398 4807 scope.go:117] "RemoveContainer" containerID="636a7cec89badf87775131bad6dfea21c3f447c41529779862c3c0eb8d804be4" Dec 02 20:18:48 crc kubenswrapper[4807]: W1202 20:18:48.645584 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3e74bd8_f6ab_439d_b7b1_afa538fbbbcd.slice/crio-33b85affaacd4d40bee750a9ccf58920885b1b0ae3a3aa45d6383ce0cfddaf1d WatchSource:0}: Error finding container 33b85affaacd4d40bee750a9ccf58920885b1b0ae3a3aa45d6383ce0cfddaf1d: Status 404 returned error can't 
find the container with id 33b85affaacd4d40bee750a9ccf58920885b1b0ae3a3aa45d6383ce0cfddaf1d Dec 02 20:18:49 crc kubenswrapper[4807]: I1202 20:18:49.282394 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kstxp"] Dec 02 20:18:49 crc kubenswrapper[4807]: I1202 20:18:49.317354 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 20:18:49 crc kubenswrapper[4807]: I1202 20:18:49.511607 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pxxrz" event={"ID":"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd","Type":"ContainerStarted","Data":"33b85affaacd4d40bee750a9ccf58920885b1b0ae3a3aa45d6383ce0cfddaf1d"} Dec 02 20:18:49 crc kubenswrapper[4807]: W1202 20:18:49.603649 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47753529_368a_4c5c_a3f8_27ffd55e41d1.slice/crio-7dcfcc6f60c32987a0a1cf0c5f6cb82fa6b34784cf1bf1c559ec16c12fceb35e WatchSource:0}: Error finding container 7dcfcc6f60c32987a0a1cf0c5f6cb82fa6b34784cf1bf1c559ec16c12fceb35e: Status 404 returned error can't find the container with id 7dcfcc6f60c32987a0a1cf0c5f6cb82fa6b34784cf1bf1c559ec16c12fceb35e Dec 02 20:18:49 crc kubenswrapper[4807]: I1202 20:18:49.628246 4807 scope.go:117] "RemoveContainer" containerID="974fb0c3f5278a94df53a3be912219508b64e45c539d6ce04026e5d8bdbf2c2f" Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.650206 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.650410 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* 
--conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qhmgq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-57d769cc4f-cpgvr_openstack(bfcf6243-4d13-42ce-ba2a-3b4774a2d4de): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.651597 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" podUID="bfcf6243-4d13-42ce-ba2a-3b4774a2d4de" Dec 02 20:18:49 crc kubenswrapper[4807]: W1202 20:18:49.665411 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80844aa2_667c_4b2d_a55a_e5fa2cd3dd85.slice/crio-7573ec33e6c443f88de0bad84899b230e06934dfb4b5b45f07f6469693eac2f2 WatchSource:0}: Error finding container 7573ec33e6c443f88de0bad84899b230e06934dfb4b5b45f07f6469693eac2f2: Status 404 returned error can't find the container with id 7573ec33e6c443f88de0bad84899b230e06934dfb4b5b45f07f6469693eac2f2 Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.725072 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.725852 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4tdtl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-nnbmr_openstack(1c99b3b4-e736-49f9-a876-7c72a2cc021c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.726935 4807 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" podUID="1c99b3b4-e736-49f9-a876-7c72a2cc021c" Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.739970 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.740176 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5xk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-n7vn5_openstack(97ab33d4-57ad-4b4e-80b7-b93c947b8f41): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.740236 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.740299 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xts95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:ni
l,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-qzrxh_openstack(3496728f-063f-4e10-85d4-003a3b64bbe7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.741410 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" podUID="3496728f-063f-4e10-85d4-003a3b64bbe7" Dec 02 20:18:49 crc kubenswrapper[4807]: E1202 20:18:49.741465 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" podUID="97ab33d4-57ad-4b4e-80b7-b93c947b8f41" Dec 02 20:18:49 crc kubenswrapper[4807]: I1202 20:18:49.878033 4807 scope.go:117] "RemoveContainer" containerID="40e3c42c30a2521ed0b01de341dc35b22a97eda4a71dfd0682f99d7164e2fda8" Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.167671 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.174783 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.366834 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 20:18:50 crc kubenswrapper[4807]: W1202 20:18:50.380218 4807 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e167e24_edba_4ff8_8b31_2c7141238bde.slice/crio-aaeff63e620a323a30c4c1823703dee029930d029611990d40d66642df0ebbf2 WatchSource:0}: Error finding container aaeff63e620a323a30c4c1823703dee029930d029611990d40d66642df0ebbf2: Status 404 returned error can't find the container with id aaeff63e620a323a30c4c1823703dee029930d029611990d40d66642df0ebbf2 Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.454125 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 20:18:50 crc kubenswrapper[4807]: W1202 20:18:50.462491 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a63890f_30c8_4538_903a_121488dba6bb.slice/crio-e9aa2d4c8705d279590bcd88025439812442649d3517f6407677f9ecfaa43f9d WatchSource:0}: Error finding container e9aa2d4c8705d279590bcd88025439812442649d3517f6407677f9ecfaa43f9d: Status 404 returned error can't find the container with id e9aa2d4c8705d279590bcd88025439812442649d3517f6407677f9ecfaa43f9d Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.497825 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.512425 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 20:18:50 crc kubenswrapper[4807]: W1202 20:18:50.515269 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96ba9206_497c_4cd1_a16d_436d2ba285a7.slice/crio-e86f49500f0499ae2d1783416d6d1f48e3b64389703ebcba6e19b0c8f928d299 WatchSource:0}: Error finding container e86f49500f0499ae2d1783416d6d1f48e3b64389703ebcba6e19b0c8f928d299: Status 404 returned error can't find the container with id e86f49500f0499ae2d1783416d6d1f48e3b64389703ebcba6e19b0c8f928d299 Dec 02 20:18:50 crc kubenswrapper[4807]: 
I1202 20:18:50.522323 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89cb183e-cc2d-4fd9-90d8-212c434f0d06","Type":"ContainerStarted","Data":"663ce3df155777da12262f64155ce07df171f9bf4556ecc619de557d6233e6a0"} Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.525283 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kstxp" event={"ID":"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85","Type":"ContainerStarted","Data":"7573ec33e6c443f88de0bad84899b230e06934dfb4b5b45f07f6469693eac2f2"} Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.526864 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7e167e24-edba-4ff8-8b31-2c7141238bde","Type":"ContainerStarted","Data":"aaeff63e620a323a30c4c1823703dee029930d029611990d40d66642df0ebbf2"} Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.529374 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"47753529-368a-4c5c-a3f8-27ffd55e41d1","Type":"ContainerStarted","Data":"7dcfcc6f60c32987a0a1cf0c5f6cb82fa6b34784cf1bf1c559ec16c12fceb35e"} Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.530997 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"df8d83f3-6675-416b-a039-2aafac45fe18","Type":"ContainerStarted","Data":"6913040957c50ea0f70aaac0f125522972e496ff4faa54a5a2dcd59684586d55"} Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.533179 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7a63890f-30c8-4538-903a-121488dba6bb","Type":"ContainerStarted","Data":"e9aa2d4c8705d279590bcd88025439812442649d3517f6407677f9ecfaa43f9d"} Dec 02 20:18:50 crc kubenswrapper[4807]: I1202 20:18:50.534571 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"4225655b-c174-479e-a740-b768c9801287","Type":"ContainerStarted","Data":"d44aa5ca3eaa9f8eba1653a850c39f5eb41eb85441da8466c24e5b9823fc2a25"} Dec 02 20:18:50 crc kubenswrapper[4807]: E1202 20:18:50.537842 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" podUID="bfcf6243-4d13-42ce-ba2a-3b4774a2d4de" Dec 02 20:18:50 crc kubenswrapper[4807]: E1202 20:18:50.538809 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" podUID="1c99b3b4-e736-49f9-a876-7c72a2cc021c" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.040431 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.056943 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.154531 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-config\") pod \"97ab33d4-57ad-4b4e-80b7-b93c947b8f41\" (UID: \"97ab33d4-57ad-4b4e-80b7-b93c947b8f41\") " Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.154664 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-config\") pod \"3496728f-063f-4e10-85d4-003a3b64bbe7\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.154706 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-dns-svc\") pod \"3496728f-063f-4e10-85d4-003a3b64bbe7\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.154828 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xts95\" (UniqueName: \"kubernetes.io/projected/3496728f-063f-4e10-85d4-003a3b64bbe7-kube-api-access-xts95\") pod \"3496728f-063f-4e10-85d4-003a3b64bbe7\" (UID: \"3496728f-063f-4e10-85d4-003a3b64bbe7\") " Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.154857 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5xk4\" (UniqueName: \"kubernetes.io/projected/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-kube-api-access-h5xk4\") pod \"97ab33d4-57ad-4b4e-80b7-b93c947b8f41\" (UID: \"97ab33d4-57ad-4b4e-80b7-b93c947b8f41\") " Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.157296 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-config" (OuterVolumeSpecName: "config") pod "3496728f-063f-4e10-85d4-003a3b64bbe7" (UID: "3496728f-063f-4e10-85d4-003a3b64bbe7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.157376 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-config" (OuterVolumeSpecName: "config") pod "97ab33d4-57ad-4b4e-80b7-b93c947b8f41" (UID: "97ab33d4-57ad-4b4e-80b7-b93c947b8f41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.157779 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3496728f-063f-4e10-85d4-003a3b64bbe7" (UID: "3496728f-063f-4e10-85d4-003a3b64bbe7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.192497 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-kube-api-access-h5xk4" (OuterVolumeSpecName: "kube-api-access-h5xk4") pod "97ab33d4-57ad-4b4e-80b7-b93c947b8f41" (UID: "97ab33d4-57ad-4b4e-80b7-b93c947b8f41"). InnerVolumeSpecName "kube-api-access-h5xk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.257043 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.257088 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5xk4\" (UniqueName: \"kubernetes.io/projected/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-kube-api-access-h5xk4\") on node \"crc\" DevicePath \"\"" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.257101 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97ab33d4-57ad-4b4e-80b7-b93c947b8f41-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.257111 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3496728f-063f-4e10-85d4-003a3b64bbe7-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.283357 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3496728f-063f-4e10-85d4-003a3b64bbe7-kube-api-access-xts95" (OuterVolumeSpecName: "kube-api-access-xts95") pod "3496728f-063f-4e10-85d4-003a3b64bbe7" (UID: "3496728f-063f-4e10-85d4-003a3b64bbe7"). InnerVolumeSpecName "kube-api-access-xts95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.360303 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xts95\" (UniqueName: \"kubernetes.io/projected/3496728f-063f-4e10-85d4-003a3b64bbe7-kube-api-access-xts95\") on node \"crc\" DevicePath \"\"" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.563534 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.563542 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qzrxh" event={"ID":"3496728f-063f-4e10-85d4-003a3b64bbe7","Type":"ContainerDied","Data":"0e6d5cdf3bb3094599b0aaae623b06493cb5f25c9b30e13ab4fb996e4982c1e3"} Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.575212 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2af53e2b-bee9-4bd4-aab6-e58d33057ac7","Type":"ContainerStarted","Data":"3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb"} Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.579456 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.580097 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-n7vn5" event={"ID":"97ab33d4-57ad-4b4e-80b7-b93c947b8f41","Type":"ContainerDied","Data":"4df17332ead7406bdf0163dfd744f576932c741b27ace1dea5056b061523d317"} Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.587292 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1b0038-6b67-47e8-92e8-06efe88df856","Type":"ContainerStarted","Data":"7f285fca8e0cd1cf35353ce537c1073826621a4833a365e1187dd2c1899b4466"} Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.597777 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"96ba9206-497c-4cd1-a16d-436d2ba285a7","Type":"ContainerStarted","Data":"e86f49500f0499ae2d1783416d6d1f48e3b64389703ebcba6e19b0c8f928d299"} Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.690487 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qzrxh"] Dec 02 20:18:51 crc 
kubenswrapper[4807]: I1202 20:18:51.708521 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qzrxh"] Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.726158 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n7vn5"] Dec 02 20:18:51 crc kubenswrapper[4807]: I1202 20:18:51.733536 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-n7vn5"] Dec 02 20:18:52 crc kubenswrapper[4807]: I1202 20:18:52.984077 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3496728f-063f-4e10-85d4-003a3b64bbe7" path="/var/lib/kubelet/pods/3496728f-063f-4e10-85d4-003a3b64bbe7/volumes" Dec 02 20:18:52 crc kubenswrapper[4807]: I1202 20:18:52.985446 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ab33d4-57ad-4b4e-80b7-b93c947b8f41" path="/var/lib/kubelet/pods/97ab33d4-57ad-4b4e-80b7-b93c947b8f41/volumes" Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.676546 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"df8d83f3-6675-416b-a039-2aafac45fe18","Type":"ContainerStarted","Data":"dc1b88c1f0b320b3d2106fc6c6a1ebc1af72e39f826ffba508b72c4ab03c83c5"} Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.683110 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7a63890f-30c8-4538-903a-121488dba6bb","Type":"ContainerStarted","Data":"434334ecf6d5f18aa16d9a59bbc5ff3c497fec7748eb1efabe081eb0805ad2a9"} Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.687351 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4225655b-c174-479e-a740-b768c9801287","Type":"ContainerStarted","Data":"e9ef7cff4b6ad8304c3192aa0d98f073112f22b16b72a35a2dcc1d5dcfd74254"} Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.687489 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/memcached-0" Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.690307 4807 generic.go:334] "Generic (PLEG): container finished" podID="f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd" containerID="23095cb3615ec2dad924297c305be2aa233048091b9b6067d63b278dc6193cc1" exitCode=0 Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.690388 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pxxrz" event={"ID":"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd","Type":"ContainerDied","Data":"23095cb3615ec2dad924297c305be2aa233048091b9b6067d63b278dc6193cc1"} Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.692258 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kstxp" event={"ID":"80844aa2-667c-4b2d-a55a-e5fa2cd3dd85","Type":"ContainerStarted","Data":"cc77b5395a645f6e425e1f1b2c089b8b4a078dd0bf05e9b15fc05d6eaefa98e6"} Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.692411 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-kstxp" Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.888209 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"96ba9206-497c-4cd1-a16d-436d2ba285a7","Type":"ContainerStarted","Data":"a5c63ac2a0be4e63320992a315b8d65eda7f7ba42e451e79276fc1b3172bf9dd"} Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.894210 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7e167e24-edba-4ff8-8b31-2c7141238bde","Type":"ContainerStarted","Data":"1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7"} Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.895138 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.897220 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"47753529-368a-4c5c-a3f8-27ffd55e41d1","Type":"ContainerStarted","Data":"129a2e9ea9ab1c872982bfd6eb30348c9cda1cfa70fa5ed4af2272d915a2b69b"} Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.956483 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=26.199948731 podStartE2EDuration="32.956458423s" podCreationTimestamp="2025-12-02 20:18:27 +0000 UTC" firstStartedPulling="2025-12-02 20:18:50.51540009 +0000 UTC m=+1265.816307595" lastFinishedPulling="2025-12-02 20:18:57.271909792 +0000 UTC m=+1272.572817287" observedRunningTime="2025-12-02 20:18:59.904473035 +0000 UTC m=+1275.205380530" watchObservedRunningTime="2025-12-02 20:18:59.956458423 +0000 UTC m=+1275.257365918" Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.959208 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-kstxp" podStartSLOduration=19.376818669 podStartE2EDuration="26.959200525s" podCreationTimestamp="2025-12-02 20:18:33 +0000 UTC" firstStartedPulling="2025-12-02 20:18:49.689364362 +0000 UTC m=+1264.990271857" lastFinishedPulling="2025-12-02 20:18:57.271746218 +0000 UTC m=+1272.572653713" observedRunningTime="2025-12-02 20:18:59.924845702 +0000 UTC m=+1275.225753197" watchObservedRunningTime="2025-12-02 20:18:59.959200525 +0000 UTC m=+1275.260108020" Dec 02 20:18:59 crc kubenswrapper[4807]: I1202 20:18:59.978205 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=22.856325406 podStartE2EDuration="30.978185171s" podCreationTimestamp="2025-12-02 20:18:29 +0000 UTC" firstStartedPulling="2025-12-02 20:18:50.385649745 +0000 UTC m=+1265.686557240" lastFinishedPulling="2025-12-02 20:18:58.50750951 +0000 UTC m=+1273.808417005" observedRunningTime="2025-12-02 20:18:59.957052051 +0000 UTC m=+1275.257959546" watchObservedRunningTime="2025-12-02 
20:18:59.978185171 +0000 UTC m=+1275.279092666" Dec 02 20:19:00 crc kubenswrapper[4807]: I1202 20:19:00.912831 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pxxrz" event={"ID":"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd","Type":"ContainerStarted","Data":"4858a14c821012b38011f28fb03da2f5cc40a0319353e636b50196f71e9dd82b"} Dec 02 20:19:00 crc kubenswrapper[4807]: I1202 20:19:00.913295 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pxxrz" event={"ID":"f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd","Type":"ContainerStarted","Data":"eef2166c1eebe8a7526a3262360ad5b5b8baed296633ac92d75df60a734bccd6"} Dec 02 20:19:00 crc kubenswrapper[4807]: I1202 20:19:00.958169 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-pxxrz" podStartSLOduration=19.580298582 podStartE2EDuration="27.958126463s" podCreationTimestamp="2025-12-02 20:18:33 +0000 UTC" firstStartedPulling="2025-12-02 20:18:48.674552562 +0000 UTC m=+1263.975460047" lastFinishedPulling="2025-12-02 20:18:57.052380433 +0000 UTC m=+1272.353287928" observedRunningTime="2025-12-02 20:19:00.948792325 +0000 UTC m=+1276.249699850" watchObservedRunningTime="2025-12-02 20:19:00.958126463 +0000 UTC m=+1276.259033988" Dec 02 20:19:01 crc kubenswrapper[4807]: I1202 20:19:01.923133 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:19:01 crc kubenswrapper[4807]: I1202 20:19:01.923189 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:19:02 crc kubenswrapper[4807]: I1202 20:19:02.939266 4807 generic.go:334] "Generic (PLEG): container finished" podID="96ba9206-497c-4cd1-a16d-436d2ba285a7" containerID="a5c63ac2a0be4e63320992a315b8d65eda7f7ba42e451e79276fc1b3172bf9dd" exitCode=0 Dec 02 20:19:02 crc kubenswrapper[4807]: I1202 20:19:02.939307 4807 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"96ba9206-497c-4cd1-a16d-436d2ba285a7","Type":"ContainerDied","Data":"a5c63ac2a0be4e63320992a315b8d65eda7f7ba42e451e79276fc1b3172bf9dd"} Dec 02 20:19:02 crc kubenswrapper[4807]: I1202 20:19:02.944387 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89cb183e-cc2d-4fd9-90d8-212c434f0d06","Type":"ContainerStarted","Data":"616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362"} Dec 02 20:19:03 crc kubenswrapper[4807]: I1202 20:19:03.080004 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 02 20:19:04 crc kubenswrapper[4807]: I1202 20:19:04.966089 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" event={"ID":"1c99b3b4-e736-49f9-a876-7c72a2cc021c","Type":"ContainerStarted","Data":"0947563239a4735425fb08760cae661a1b66b47225f9a03820c8226aed5fc7d6"} Dec 02 20:19:05 crc kubenswrapper[4807]: I1202 20:19:05.978356 4807 generic.go:334] "Generic (PLEG): container finished" podID="1c99b3b4-e736-49f9-a876-7c72a2cc021c" containerID="0947563239a4735425fb08760cae661a1b66b47225f9a03820c8226aed5fc7d6" exitCode=0 Dec 02 20:19:05 crc kubenswrapper[4807]: I1202 20:19:05.978416 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" event={"ID":"1c99b3b4-e736-49f9-a876-7c72a2cc021c","Type":"ContainerDied","Data":"0947563239a4735425fb08760cae661a1b66b47225f9a03820c8226aed5fc7d6"} Dec 02 20:19:06 crc kubenswrapper[4807]: I1202 20:19:06.996763 4807 generic.go:334] "Generic (PLEG): container finished" podID="df8d83f3-6675-416b-a039-2aafac45fe18" containerID="dc1b88c1f0b320b3d2106fc6c6a1ebc1af72e39f826ffba508b72c4ab03c83c5" exitCode=0 Dec 02 20:19:06 crc kubenswrapper[4807]: I1202 20:19:06.996828 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"df8d83f3-6675-416b-a039-2aafac45fe18","Type":"ContainerDied","Data":"dc1b88c1f0b320b3d2106fc6c6a1ebc1af72e39f826ffba508b72c4ab03c83c5"} Dec 02 20:19:09 crc kubenswrapper[4807]: I1202 20:19:09.015381 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"47753529-368a-4c5c-a3f8-27ffd55e41d1","Type":"ContainerStarted","Data":"fe86f31fd100171bc6b5ed0a0a8a451ad1df6b29a64d239d773fe3b2d1be1b15"} Dec 02 20:19:09 crc kubenswrapper[4807]: I1202 20:19:09.018082 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7a63890f-30c8-4538-903a-121488dba6bb","Type":"ContainerStarted","Data":"9b93a59fff5fb54c54d9f2ded24716a5d1cba81869e14037d631e6312002536c"} Dec 02 20:19:09 crc kubenswrapper[4807]: I1202 20:19:09.024033 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"96ba9206-497c-4cd1-a16d-436d2ba285a7","Type":"ContainerStarted","Data":"8b03cfce5998a0e6fb48a59112134feed63d519847abf7078ba538cadfb70d4d"} Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.009168 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.171873 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cpgvr"] Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.283414 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fkzfp"] Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.285203 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.302886 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fkzfp"] Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.384222 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5gnt\" (UniqueName: \"kubernetes.io/projected/18a453c8-e7f8-4c78-814c-d1d9f79320b5-kube-api-access-p5gnt\") pod \"dnsmasq-dns-7cb5889db5-fkzfp\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.384279 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-config\") pod \"dnsmasq-dns-7cb5889db5-fkzfp\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.384360 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-fkzfp\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.486552 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5gnt\" (UniqueName: \"kubernetes.io/projected/18a453c8-e7f8-4c78-814c-d1d9f79320b5-kube-api-access-p5gnt\") pod \"dnsmasq-dns-7cb5889db5-fkzfp\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.486607 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-config\") pod \"dnsmasq-dns-7cb5889db5-fkzfp\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.487434 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-fkzfp\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.488167 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-fkzfp\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.489149 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-config\") pod \"dnsmasq-dns-7cb5889db5-fkzfp\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.523603 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5gnt\" (UniqueName: \"kubernetes.io/projected/18a453c8-e7f8-4c78-814c-d1d9f79320b5-kube-api-access-p5gnt\") pod \"dnsmasq-dns-7cb5889db5-fkzfp\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.605152 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:10 crc kubenswrapper[4807]: I1202 20:19:10.871059 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fkzfp"] Dec 02 20:19:10 crc kubenswrapper[4807]: W1202 20:19:10.876985 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18a453c8_e7f8_4c78_814c_d1d9f79320b5.slice/crio-00f39d22b1fc96901e3533136645c981710a52c223b3de0945b4aab9c3ab00a0 WatchSource:0}: Error finding container 00f39d22b1fc96901e3533136645c981710a52c223b3de0945b4aab9c3ab00a0: Status 404 returned error can't find the container with id 00f39d22b1fc96901e3533136645c981710a52c223b3de0945b4aab9c3ab00a0 Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.053266 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" event={"ID":"18a453c8-e7f8-4c78-814c-d1d9f79320b5","Type":"ContainerStarted","Data":"00f39d22b1fc96901e3533136645c981710a52c223b3de0945b4aab9c3ab00a0"} Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.054987 4807 generic.go:334] "Generic (PLEG): container finished" podID="bfcf6243-4d13-42ce-ba2a-3b4774a2d4de" containerID="b1af5793c20a49a944cdb922fe9617e8cc9f9b455afa0ac22477b44b5c9f37b4" exitCode=0 Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.055218 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" event={"ID":"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de","Type":"ContainerDied","Data":"b1af5793c20a49a944cdb922fe9617e8cc9f9b455afa0ac22477b44b5c9f37b4"} Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.057253 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"df8d83f3-6675-416b-a039-2aafac45fe18","Type":"ContainerStarted","Data":"87c92454171e1b7261c325560105c3ecbf7540448f182b97b860b5345abe4e76"} Dec 02 20:19:11 crc 
kubenswrapper[4807]: I1202 20:19:11.059275 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89cb183e-cc2d-4fd9-90d8-212c434f0d06","Type":"ContainerDied","Data":"616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362"} Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.060996 4807 generic.go:334] "Generic (PLEG): container finished" podID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerID="616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362" exitCode=0 Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.119350 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=38.366216067 podStartE2EDuration="45.119332309s" podCreationTimestamp="2025-12-02 20:18:26 +0000 UTC" firstStartedPulling="2025-12-02 20:18:50.517939495 +0000 UTC m=+1265.818846990" lastFinishedPulling="2025-12-02 20:18:57.271055737 +0000 UTC m=+1272.571963232" observedRunningTime="2025-12-02 20:19:11.118174435 +0000 UTC m=+1286.419081950" watchObservedRunningTime="2025-12-02 20:19:11.119332309 +0000 UTC m=+1286.420239804" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.181558 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=26.561840179 podStartE2EDuration="39.181535612s" podCreationTimestamp="2025-12-02 20:18:32 +0000 UTC" firstStartedPulling="2025-12-02 20:18:50.484815849 +0000 UTC m=+1265.785723354" lastFinishedPulling="2025-12-02 20:19:03.104511292 +0000 UTC m=+1278.405418787" observedRunningTime="2025-12-02 20:19:11.176626706 +0000 UTC m=+1286.477534201" watchObservedRunningTime="2025-12-02 20:19:11.181535612 +0000 UTC m=+1286.482443107" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.223821 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=21.567459626 
podStartE2EDuration="35.223798181s" podCreationTimestamp="2025-12-02 20:18:36 +0000 UTC" firstStartedPulling="2025-12-02 20:18:49.608798112 +0000 UTC m=+1264.909705607" lastFinishedPulling="2025-12-02 20:19:03.265136667 +0000 UTC m=+1278.566044162" observedRunningTime="2025-12-02 20:19:11.215281718 +0000 UTC m=+1286.516189213" watchObservedRunningTime="2025-12-02 20:19:11.223798181 +0000 UTC m=+1286.524705666" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.245956 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=38.958362979 podStartE2EDuration="46.245933831s" podCreationTimestamp="2025-12-02 20:18:25 +0000 UTC" firstStartedPulling="2025-12-02 20:18:50.176789533 +0000 UTC m=+1265.477697028" lastFinishedPulling="2025-12-02 20:18:57.464360375 +0000 UTC m=+1272.765267880" observedRunningTime="2025-12-02 20:19:11.232992345 +0000 UTC m=+1286.533899840" watchObservedRunningTime="2025-12-02 20:19:11.245933831 +0000 UTC m=+1286.546841326" Dec 02 20:19:11 crc kubenswrapper[4807]: E1202 20:19:11.302259 4807 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 02 20:19:11 crc kubenswrapper[4807]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/1c99b3b4-e736-49f9-a876-7c72a2cc021c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 20:19:11 crc kubenswrapper[4807]: > podSandboxID="a28fbf5e1a34b7f6e9cc7173865cdcd9c9b5c065c7098658726143c36da04c2c" Dec 02 20:19:11 crc kubenswrapper[4807]: E1202 20:19:11.302406 4807 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 02 20:19:11 crc kubenswrapper[4807]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug 
--bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4tdtl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-nnbmr_openstack(1c99b3b4-e736-49f9-a876-7c72a2cc021c): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/1c99b3b4-e736-49f9-a876-7c72a2cc021c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 20:19:11 crc kubenswrapper[4807]: > logger="UnhandledError" Dec 02 20:19:11 crc kubenswrapper[4807]: E1202 20:19:11.303626 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/1c99b3b4-e736-49f9-a876-7c72a2cc021c/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" podUID="1c99b3b4-e736-49f9-a876-7c72a2cc021c" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.368343 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.507616 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhmgq\" (UniqueName: \"kubernetes.io/projected/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-kube-api-access-qhmgq\") pod \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.507784 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-dns-svc\") pod \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.507979 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-config\") pod \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\" (UID: \"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de\") " Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.512263 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-kube-api-access-qhmgq" (OuterVolumeSpecName: "kube-api-access-qhmgq") pod "bfcf6243-4d13-42ce-ba2a-3b4774a2d4de" (UID: "bfcf6243-4d13-42ce-ba2a-3b4774a2d4de"). InnerVolumeSpecName "kube-api-access-qhmgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.528949 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-config" (OuterVolumeSpecName: "config") pod "bfcf6243-4d13-42ce-ba2a-3b4774a2d4de" (UID: "bfcf6243-4d13-42ce-ba2a-3b4774a2d4de"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.530346 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bfcf6243-4d13-42ce-ba2a-3b4774a2d4de" (UID: "bfcf6243-4d13-42ce-ba2a-3b4774a2d4de"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.610375 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.610427 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhmgq\" (UniqueName: \"kubernetes.io/projected/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-kube-api-access-qhmgq\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.610438 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.807704 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 02 20:19:11 crc kubenswrapper[4807]: E1202 20:19:11.808504 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfcf6243-4d13-42ce-ba2a-3b4774a2d4de" containerName="init" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.808525 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfcf6243-4d13-42ce-ba2a-3b4774a2d4de" containerName="init" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.808685 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfcf6243-4d13-42ce-ba2a-3b4774a2d4de" containerName="init" Dec 02 20:19:11 crc 
kubenswrapper[4807]: I1202 20:19:11.814368 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.816805 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-58ldt" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.816989 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.817251 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.818048 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.841243 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.916227 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/caaccc0b-6743-4907-9d87-f4ab26c931e2-cache\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.916277 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.916330 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift\") pod \"swift-storage-0\" (UID: 
\"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.916366 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59ftp\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-kube-api-access-59ftp\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:11 crc kubenswrapper[4807]: I1202 20:19:11.916381 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/caaccc0b-6743-4907-9d87-f4ab26c931e2-lock\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.018329 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/caaccc0b-6743-4907-9d87-f4ab26c931e2-cache\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.018426 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.018493 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.018541 4807 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-59ftp\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-kube-api-access-59ftp\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.018564 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/caaccc0b-6743-4907-9d87-f4ab26c931e2-lock\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: E1202 20:19:12.018787 4807 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 20:19:12 crc kubenswrapper[4807]: E1202 20:19:12.018831 4807 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 20:19:12 crc kubenswrapper[4807]: E1202 20:19:12.018909 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift podName:caaccc0b-6743-4907-9d87-f4ab26c931e2 nodeName:}" failed. No retries permitted until 2025-12-02 20:19:12.518882337 +0000 UTC m=+1287.819789842 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift") pod "swift-storage-0" (UID: "caaccc0b-6743-4907-9d87-f4ab26c931e2") : configmap "swift-ring-files" not found Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.019011 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.019183 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/caaccc0b-6743-4907-9d87-f4ab26c931e2-cache\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.019240 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/caaccc0b-6743-4907-9d87-f4ab26c931e2-lock\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.038333 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59ftp\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-kube-api-access-59ftp\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.041249 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " 
pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.069577 4807 generic.go:334] "Generic (PLEG): container finished" podID="18a453c8-e7f8-4c78-814c-d1d9f79320b5" containerID="bc5db0d31429d1be5c9b239c4c336fb8e27d6756a85325dac8fde8af9bc0c300" exitCode=0 Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.069637 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" event={"ID":"18a453c8-e7f8-4c78-814c-d1d9f79320b5","Type":"ContainerDied","Data":"bc5db0d31429d1be5c9b239c4c336fb8e27d6756a85325dac8fde8af9bc0c300"} Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.072221 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" event={"ID":"bfcf6243-4d13-42ce-ba2a-3b4774a2d4de","Type":"ContainerDied","Data":"a7aa50202b82142f653d9e6602bda689faecaf032754478bb11d67ef4bceaa88"} Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.072281 4807 scope.go:117] "RemoveContainer" containerID="b1af5793c20a49a944cdb922fe9617e8cc9f9b455afa0ac22477b44b5c9f37b4" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.072366 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-cpgvr" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.195639 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-b94t7"] Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.196839 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.211825 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-b94t7"] Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.212327 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.212552 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.212734 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.246132 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cpgvr"] Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.259921 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-cpgvr"] Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.268979 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-msbf5"] Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.270416 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.274274 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-b94t7"] Dec 02 20:19:12 crc kubenswrapper[4807]: E1202 20:19:12.274977 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-56fmx ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-56fmx ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-b94t7" podUID="ee2880b4-fddf-40fa-a760-af84e276af14" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.284638 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-msbf5"] Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.336594 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-etc-swift\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.337121 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-scripts\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.337198 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-combined-ca-bundle\") pod \"swift-ring-rebalance-msbf5\" 
(UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.337298 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56fmx\" (UniqueName: \"kubernetes.io/projected/ee2880b4-fddf-40fa-a760-af84e276af14-kube-api-access-56fmx\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.337419 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-ring-data-devices\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.337529 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-dispersionconf\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.337625 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-dispersionconf\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.337742 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-swiftconf\") pod 
\"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.337848 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-swiftconf\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.337971 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-scripts\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.338071 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9bnh\" (UniqueName: \"kubernetes.io/projected/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-kube-api-access-f9bnh\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.338163 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ee2880b4-fddf-40fa-a760-af84e276af14-etc-swift\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.338283 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-combined-ca-bundle\") 
pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.338382 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-ring-data-devices\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.439997 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-etc-swift\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440071 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-scripts\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440090 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-combined-ca-bundle\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440121 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56fmx\" (UniqueName: \"kubernetes.io/projected/ee2880b4-fddf-40fa-a760-af84e276af14-kube-api-access-56fmx\") pod \"swift-ring-rebalance-b94t7\" (UID: 
\"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440147 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-ring-data-devices\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440172 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-dispersionconf\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440190 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-dispersionconf\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440208 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-swiftconf\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440255 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-swiftconf\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 
20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440306 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-scripts\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440332 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9bnh\" (UniqueName: \"kubernetes.io/projected/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-kube-api-access-f9bnh\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440353 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ee2880b4-fddf-40fa-a760-af84e276af14-etc-swift\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440412 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-combined-ca-bundle\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440439 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-ring-data-devices\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.440481 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-etc-swift\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.441043 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-scripts\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.441046 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-ring-data-devices\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.441634 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ee2880b4-fddf-40fa-a760-af84e276af14-etc-swift\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.441780 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-scripts\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.442318 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-ring-data-devices\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.446055 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-dispersionconf\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.446515 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-combined-ca-bundle\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.447003 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-dispersionconf\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.447188 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-swiftconf\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.454904 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-combined-ca-bundle\") pod \"swift-ring-rebalance-msbf5\" 
(UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.456922 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-swiftconf\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.459929 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56fmx\" (UniqueName: \"kubernetes.io/projected/ee2880b4-fddf-40fa-a760-af84e276af14-kube-api-access-56fmx\") pod \"swift-ring-rebalance-b94t7\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.461927 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9bnh\" (UniqueName: \"kubernetes.io/projected/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-kube-api-access-f9bnh\") pod \"swift-ring-rebalance-msbf5\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.542087 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:12 crc kubenswrapper[4807]: E1202 20:19:12.542426 4807 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 20:19:12 crc kubenswrapper[4807]: E1202 20:19:12.542448 4807 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 20:19:12 crc 
kubenswrapper[4807]: E1202 20:19:12.542503 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift podName:caaccc0b-6743-4907-9d87-f4ab26c931e2 nodeName:}" failed. No retries permitted until 2025-12-02 20:19:13.542481684 +0000 UTC m=+1288.843389179 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift") pod "swift-storage-0" (UID: "caaccc0b-6743-4907-9d87-f4ab26c931e2") : configmap "swift-ring-files" not found Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.586127 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.870461 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 02 20:19:12 crc kubenswrapper[4807]: I1202 20:19:12.984702 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfcf6243-4d13-42ce-ba2a-3b4774a2d4de" path="/var/lib/kubelet/pods/bfcf6243-4d13-42ce-ba2a-3b4774a2d4de/volumes" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.092466 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" event={"ID":"18a453c8-e7f8-4c78-814c-d1d9f79320b5","Type":"ContainerStarted","Data":"26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba"} Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.093787 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.097464 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.106326 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-msbf5"] Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.118115 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.124940 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" podStartSLOduration=3.124915714 podStartE2EDuration="3.124915714s" podCreationTimestamp="2025-12-02 20:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:19:13.116529894 +0000 UTC m=+1288.417437389" watchObservedRunningTime="2025-12-02 20:19:13.124915714 +0000 UTC m=+1288.425823209" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.152101 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.165756 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.258273 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56fmx\" (UniqueName: \"kubernetes.io/projected/ee2880b4-fddf-40fa-a760-af84e276af14-kube-api-access-56fmx\") pod \"ee2880b4-fddf-40fa-a760-af84e276af14\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.259520 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-ring-data-devices\") pod \"ee2880b4-fddf-40fa-a760-af84e276af14\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.259996 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ee2880b4-fddf-40fa-a760-af84e276af14" (UID: "ee2880b4-fddf-40fa-a760-af84e276af14"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.260937 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-dispersionconf\") pod \"ee2880b4-fddf-40fa-a760-af84e276af14\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.261070 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-swiftconf\") pod \"ee2880b4-fddf-40fa-a760-af84e276af14\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.261291 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-combined-ca-bundle\") pod \"ee2880b4-fddf-40fa-a760-af84e276af14\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.261399 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-scripts\") pod \"ee2880b4-fddf-40fa-a760-af84e276af14\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.261517 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ee2880b4-fddf-40fa-a760-af84e276af14-etc-swift\") pod \"ee2880b4-fddf-40fa-a760-af84e276af14\" (UID: \"ee2880b4-fddf-40fa-a760-af84e276af14\") " Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.261885 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-scripts" (OuterVolumeSpecName: "scripts") pod "ee2880b4-fddf-40fa-a760-af84e276af14" (UID: "ee2880b4-fddf-40fa-a760-af84e276af14"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.262216 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.262377 4807 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ee2880b4-fddf-40fa-a760-af84e276af14-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.262336 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee2880b4-fddf-40fa-a760-af84e276af14-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ee2880b4-fddf-40fa-a760-af84e276af14" (UID: "ee2880b4-fddf-40fa-a760-af84e276af14"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.265396 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee2880b4-fddf-40fa-a760-af84e276af14" (UID: "ee2880b4-fddf-40fa-a760-af84e276af14"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.265536 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee2880b4-fddf-40fa-a760-af84e276af14-kube-api-access-56fmx" (OuterVolumeSpecName: "kube-api-access-56fmx") pod "ee2880b4-fddf-40fa-a760-af84e276af14" (UID: "ee2880b4-fddf-40fa-a760-af84e276af14"). InnerVolumeSpecName "kube-api-access-56fmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.265824 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ee2880b4-fddf-40fa-a760-af84e276af14" (UID: "ee2880b4-fddf-40fa-a760-af84e276af14"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.267538 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ee2880b4-fddf-40fa-a760-af84e276af14" (UID: "ee2880b4-fddf-40fa-a760-af84e276af14"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.363544 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.363582 4807 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ee2880b4-fddf-40fa-a760-af84e276af14-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.363594 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56fmx\" (UniqueName: \"kubernetes.io/projected/ee2880b4-fddf-40fa-a760-af84e276af14-kube-api-access-56fmx\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.363605 4807 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.363615 4807 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ee2880b4-fddf-40fa-a760-af84e276af14-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.567022 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:13 crc kubenswrapper[4807]: E1202 20:19:13.567313 4807 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 20:19:13 crc kubenswrapper[4807]: E1202 20:19:13.567347 4807 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 20:19:13 crc kubenswrapper[4807]: E1202 20:19:13.567433 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift podName:caaccc0b-6743-4907-9d87-f4ab26c931e2 nodeName:}" failed. No retries permitted until 2025-12-02 20:19:15.567404976 +0000 UTC m=+1290.868312471 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift") pod "swift-storage-0" (UID: "caaccc0b-6743-4907-9d87-f4ab26c931e2") : configmap "swift-ring-files" not found Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.870526 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 02 20:19:13 crc kubenswrapper[4807]: I1202 20:19:13.918625 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.115785 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-b94t7" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.115823 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-msbf5" event={"ID":"a3dd1529-ab40-486b-8458-e3d1afc9a0e2","Type":"ContainerStarted","Data":"d3be9e7893e6b2852ac72c6275d412b24a9fb782ad6883d9fa65623ac3c0a5bb"} Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.117344 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.182524 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.184169 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-b94t7"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.194004 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.202810 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-b94t7"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.385253 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nnbmr"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.421473 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-xkdjx"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.423405 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.430516 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.442015 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-c2pcc"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.444680 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.448839 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.456625 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-c2pcc"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.474609 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-xkdjx"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.482336 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ee2869c-161d-442b-81a3-b3790ab8cdfe-ovs-rundir\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.482395 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ee2869c-161d-442b-81a3-b3790ab8cdfe-ovn-rundir\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.482469 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-config\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.482494 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-dns-svc\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.482545 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ee2869c-161d-442b-81a3-b3790ab8cdfe-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.482591 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-ovsdbserver-sb\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.482632 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ee2869c-161d-442b-81a3-b3790ab8cdfe-combined-ca-bundle\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.482672 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpkw6\" (UniqueName: \"kubernetes.io/projected/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-kube-api-access-qpkw6\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.482985 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee2869c-161d-442b-81a3-b3790ab8cdfe-config\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.483054 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75f2v\" (UniqueName: \"kubernetes.io/projected/0ee2869c-161d-442b-81a3-b3790ab8cdfe-kube-api-access-75f2v\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.585144 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-config\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.585190 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-dns-svc\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.585238 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ee2869c-161d-442b-81a3-b3790ab8cdfe-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.585270 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-ovsdbserver-sb\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.585301 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ee2869c-161d-442b-81a3-b3790ab8cdfe-combined-ca-bundle\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.585324 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpkw6\" (UniqueName: \"kubernetes.io/projected/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-kube-api-access-qpkw6\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.585375 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee2869c-161d-442b-81a3-b3790ab8cdfe-config\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.585412 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-75f2v\" (UniqueName: \"kubernetes.io/projected/0ee2869c-161d-442b-81a3-b3790ab8cdfe-kube-api-access-75f2v\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.585430 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ee2869c-161d-442b-81a3-b3790ab8cdfe-ovs-rundir\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.585450 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ee2869c-161d-442b-81a3-b3790ab8cdfe-ovn-rundir\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.585764 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ee2869c-161d-442b-81a3-b3790ab8cdfe-ovn-rundir\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.586525 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-config\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.587257 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0ee2869c-161d-442b-81a3-b3790ab8cdfe-config\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.588239 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ee2869c-161d-442b-81a3-b3790ab8cdfe-ovs-rundir\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.591406 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-ovsdbserver-sb\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.594660 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-dns-svc\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.599272 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ee2869c-161d-442b-81a3-b3790ab8cdfe-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.601292 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ee2869c-161d-442b-81a3-b3790ab8cdfe-combined-ca-bundle\") pod 
\"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.610688 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpkw6\" (UniqueName: \"kubernetes.io/projected/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-kube-api-access-qpkw6\") pod \"dnsmasq-dns-8cc7fc4dc-xkdjx\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.617346 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75f2v\" (UniqueName: \"kubernetes.io/projected/0ee2869c-161d-442b-81a3-b3790ab8cdfe-kube-api-access-75f2v\") pod \"ovn-controller-metrics-c2pcc\" (UID: \"0ee2869c-161d-442b-81a3-b3790ab8cdfe\") " pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.636921 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fkzfp"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.673251 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-tdkbv"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.674731 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.685285 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.687760 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-tdkbv"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.727331 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.728834 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.745484 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.745758 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-fqhrt" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.745989 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.747767 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.764217 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.773832 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.786384 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-c2pcc" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.792793 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-config\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.792868 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxfcp\" (UniqueName: \"kubernetes.io/projected/9e632286-c5c3-48a0-a79c-445edea3b864-kube-api-access-pxfcp\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.792906 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.792940 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.792979 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: 
\"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895048 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895132 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk4rz\" (UniqueName: \"kubernetes.io/projected/06539675-3505-4f57-bdfd-54ccdb96d90a-kube-api-access-bk4rz\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895182 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/06539675-3505-4f57-bdfd-54ccdb96d90a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895214 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/06539675-3505-4f57-bdfd-54ccdb96d90a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895273 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06539675-3505-4f57-bdfd-54ccdb96d90a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " 
pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895305 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06539675-3505-4f57-bdfd-54ccdb96d90a-config\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895332 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06539675-3505-4f57-bdfd-54ccdb96d90a-scripts\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895380 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/06539675-3505-4f57-bdfd-54ccdb96d90a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895429 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-config\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895456 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfcp\" (UniqueName: \"kubernetes.io/projected/9e632286-c5c3-48a0-a79c-445edea3b864-kube-api-access-pxfcp\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895494 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.895540 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.896953 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.897685 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.898627 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-config\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.903346 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.919073 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxfcp\" (UniqueName: \"kubernetes.io/projected/9e632286-c5c3-48a0-a79c-445edea3b864-kube-api-access-pxfcp\") pod \"dnsmasq-dns-b8fbc5445-tdkbv\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.996409 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee2880b4-fddf-40fa-a760-af84e276af14" path="/var/lib/kubelet/pods/ee2880b4-fddf-40fa-a760-af84e276af14/volumes" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.997618 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06539675-3505-4f57-bdfd-54ccdb96d90a-scripts\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.997653 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/06539675-3505-4f57-bdfd-54ccdb96d90a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.997840 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk4rz\" (UniqueName: \"kubernetes.io/projected/06539675-3505-4f57-bdfd-54ccdb96d90a-kube-api-access-bk4rz\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.997861 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/06539675-3505-4f57-bdfd-54ccdb96d90a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.997878 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/06539675-3505-4f57-bdfd-54ccdb96d90a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.997914 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06539675-3505-4f57-bdfd-54ccdb96d90a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:14 crc kubenswrapper[4807]: I1202 20:19:14.997934 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06539675-3505-4f57-bdfd-54ccdb96d90a-config\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.001354 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06539675-3505-4f57-bdfd-54ccdb96d90a-scripts\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.001663 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/06539675-3505-4f57-bdfd-54ccdb96d90a-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.003943 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.004617 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/06539675-3505-4f57-bdfd-54ccdb96d90a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.007052 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06539675-3505-4f57-bdfd-54ccdb96d90a-config\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.019661 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/06539675-3505-4f57-bdfd-54ccdb96d90a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.025663 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.025902 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk4rz\" (UniqueName: \"kubernetes.io/projected/06539675-3505-4f57-bdfd-54ccdb96d90a-kube-api-access-bk4rz\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.028454 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06539675-3505-4f57-bdfd-54ccdb96d90a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"06539675-3505-4f57-bdfd-54ccdb96d90a\") " pod="openstack/ovn-northd-0" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.079924 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.099962 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-config\") pod \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.100025 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tdtl\" (UniqueName: \"kubernetes.io/projected/1c99b3b4-e736-49f9-a876-7c72a2cc021c-kube-api-access-4tdtl\") pod \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.100093 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-dns-svc\") pod \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\" (UID: \"1c99b3b4-e736-49f9-a876-7c72a2cc021c\") " 
Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.124970 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c99b3b4-e736-49f9-a876-7c72a2cc021c-kube-api-access-4tdtl" (OuterVolumeSpecName: "kube-api-access-4tdtl") pod "1c99b3b4-e736-49f9-a876-7c72a2cc021c" (UID: "1c99b3b4-e736-49f9-a876-7c72a2cc021c"). InnerVolumeSpecName "kube-api-access-4tdtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.141306 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" event={"ID":"1c99b3b4-e736-49f9-a876-7c72a2cc021c","Type":"ContainerDied","Data":"a28fbf5e1a34b7f6e9cc7173865cdcd9c9b5c065c7098658726143c36da04c2c"} Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.141399 4807 scope.go:117] "RemoveContainer" containerID="0947563239a4735425fb08760cae661a1b66b47225f9a03820c8226aed5fc7d6" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.141542 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nnbmr" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.144511 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" podUID="18a453c8-e7f8-4c78-814c-d1d9f79320b5" containerName="dnsmasq-dns" containerID="cri-o://26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba" gracePeriod=10 Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.218875 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tdtl\" (UniqueName: \"kubernetes.io/projected/1c99b3b4-e736-49f9-a876-7c72a2cc021c-kube-api-access-4tdtl\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.297838 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c99b3b4-e736-49f9-a876-7c72a2cc021c" (UID: "1c99b3b4-e736-49f9-a876-7c72a2cc021c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.327220 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.343090 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-config" (OuterVolumeSpecName: "config") pod "1c99b3b4-e736-49f9-a876-7c72a2cc021c" (UID: "1c99b3b4-e736-49f9-a876-7c72a2cc021c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.364730 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-c2pcc"] Dec 02 20:19:15 crc kubenswrapper[4807]: W1202 20:19:15.375756 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ee2869c_161d_442b_81a3_b3790ab8cdfe.slice/crio-17ea0cba10f1adc5a997a065dc18c0f4b6a5b6fbcbd75a08988dd051b00e4b4c WatchSource:0}: Error finding container 17ea0cba10f1adc5a997a065dc18c0f4b6a5b6fbcbd75a08988dd051b00e4b4c: Status 404 returned error can't find the container with id 17ea0cba10f1adc5a997a065dc18c0f4b6a5b6fbcbd75a08988dd051b00e4b4c Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.428547 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c99b3b4-e736-49f9-a876-7c72a2cc021c-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.561236 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nnbmr"] Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.569514 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nnbmr"] Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.633401 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:15 crc kubenswrapper[4807]: E1202 20:19:15.633659 4807 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 20:19:15 crc kubenswrapper[4807]: E1202 20:19:15.633675 4807 projected.go:194] Error preparing data for projected volume 
etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 20:19:15 crc kubenswrapper[4807]: E1202 20:19:15.633757 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift podName:caaccc0b-6743-4907-9d87-f4ab26c931e2 nodeName:}" failed. No retries permitted until 2025-12-02 20:19:19.633737131 +0000 UTC m=+1294.934644626 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift") pod "swift-storage-0" (UID: "caaccc0b-6743-4907-9d87-f4ab26c931e2") : configmap "swift-ring-files" not found Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.647553 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-xkdjx"] Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.812056 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 20:19:15 crc kubenswrapper[4807]: I1202 20:19:15.922875 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.042926 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-dns-svc\") pod \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.042999 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5gnt\" (UniqueName: \"kubernetes.io/projected/18a453c8-e7f8-4c78-814c-d1d9f79320b5-kube-api-access-p5gnt\") pod \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.043032 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-config\") pod \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\" (UID: \"18a453c8-e7f8-4c78-814c-d1d9f79320b5\") " Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.048257 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-tdkbv"] Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.049052 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a453c8-e7f8-4c78-814c-d1d9f79320b5-kube-api-access-p5gnt" (OuterVolumeSpecName: "kube-api-access-p5gnt") pod "18a453c8-e7f8-4c78-814c-d1d9f79320b5" (UID: "18a453c8-e7f8-4c78-814c-d1d9f79320b5"). InnerVolumeSpecName "kube-api-access-p5gnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.084415 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-config" (OuterVolumeSpecName: "config") pod "18a453c8-e7f8-4c78-814c-d1d9f79320b5" (UID: "18a453c8-e7f8-4c78-814c-d1d9f79320b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.087432 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18a453c8-e7f8-4c78-814c-d1d9f79320b5" (UID: "18a453c8-e7f8-4c78-814c-d1d9f79320b5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.145513 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.145552 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18a453c8-e7f8-4c78-814c-d1d9f79320b5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.145564 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5gnt\" (UniqueName: \"kubernetes.io/projected/18a453c8-e7f8-4c78-814c-d1d9f79320b5-kube-api-access-p5gnt\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.152106 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"06539675-3505-4f57-bdfd-54ccdb96d90a","Type":"ContainerStarted","Data":"4f03074cbdb1145f5a8e61015d898a3d673e7df5322a4d43dd47a984098aec66"} Dec 02 20:19:16 
crc kubenswrapper[4807]: I1202 20:19:16.154085 4807 generic.go:334] "Generic (PLEG): container finished" podID="688a394e-b6ac-45a4-8617-1b59bcb1d7b6" containerID="1c7d641f183cba87a8f1bf64128f0b5d2c35ce3c74ffd94b0ad626df5f1a5b7d" exitCode=0 Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.154147 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" event={"ID":"688a394e-b6ac-45a4-8617-1b59bcb1d7b6","Type":"ContainerDied","Data":"1c7d641f183cba87a8f1bf64128f0b5d2c35ce3c74ffd94b0ad626df5f1a5b7d"} Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.154196 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" event={"ID":"688a394e-b6ac-45a4-8617-1b59bcb1d7b6","Type":"ContainerStarted","Data":"d890c45f54a719e9902a2bd5c8a4a10d2bce1c40e4a3766ed9df490e084fde7a"} Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.157710 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-c2pcc" event={"ID":"0ee2869c-161d-442b-81a3-b3790ab8cdfe","Type":"ContainerStarted","Data":"b83cd0db983a9e8b894f687284319744fb1c72d65bff0835f5b778a7e5f57875"} Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.157759 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-c2pcc" event={"ID":"0ee2869c-161d-442b-81a3-b3790ab8cdfe","Type":"ContainerStarted","Data":"17ea0cba10f1adc5a997a065dc18c0f4b6a5b6fbcbd75a08988dd051b00e4b4c"} Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.167095 4807 generic.go:334] "Generic (PLEG): container finished" podID="18a453c8-e7f8-4c78-814c-d1d9f79320b5" containerID="26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba" exitCode=0 Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.167198 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.167312 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" event={"ID":"18a453c8-e7f8-4c78-814c-d1d9f79320b5","Type":"ContainerDied","Data":"26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba"} Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.167407 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-fkzfp" event={"ID":"18a453c8-e7f8-4c78-814c-d1d9f79320b5","Type":"ContainerDied","Data":"00f39d22b1fc96901e3533136645c981710a52c223b3de0945b4aab9c3ab00a0"} Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.167455 4807 scope.go:117] "RemoveContainer" containerID="26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.240124 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-c2pcc" podStartSLOduration=2.240095624 podStartE2EDuration="2.240095624s" podCreationTimestamp="2025-12-02 20:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:19:16.212075729 +0000 UTC m=+1291.512983224" watchObservedRunningTime="2025-12-02 20:19:16.240095624 +0000 UTC m=+1291.541003119" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.270958 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fkzfp"] Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.277999 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-fkzfp"] Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.740130 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 
20:19:16.740422 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.983633 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18a453c8-e7f8-4c78-814c-d1d9f79320b5" path="/var/lib/kubelet/pods/18a453c8-e7f8-4c78-814c-d1d9f79320b5/volumes" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.984249 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c99b3b4-e736-49f9-a876-7c72a2cc021c" path="/var/lib/kubelet/pods/1c99b3b4-e736-49f9-a876-7c72a2cc021c/volumes" Dec 02 20:19:16 crc kubenswrapper[4807]: I1202 20:19:16.990870 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.270049 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.590398 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-23da-account-create-update-6mqn7"] Dec 02 20:19:17 crc kubenswrapper[4807]: E1202 20:19:17.590887 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a453c8-e7f8-4c78-814c-d1d9f79320b5" containerName="init" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.590908 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a453c8-e7f8-4c78-814c-d1d9f79320b5" containerName="init" Dec 02 20:19:17 crc kubenswrapper[4807]: E1202 20:19:17.590936 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c99b3b4-e736-49f9-a876-7c72a2cc021c" containerName="init" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.590945 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c99b3b4-e736-49f9-a876-7c72a2cc021c" containerName="init" Dec 02 20:19:17 crc kubenswrapper[4807]: E1202 20:19:17.590964 4807 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="18a453c8-e7f8-4c78-814c-d1d9f79320b5" containerName="dnsmasq-dns" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.590971 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a453c8-e7f8-4c78-814c-d1d9f79320b5" containerName="dnsmasq-dns" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.591190 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c99b3b4-e736-49f9-a876-7c72a2cc021c" containerName="init" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.591213 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a453c8-e7f8-4c78-814c-d1d9f79320b5" containerName="dnsmasq-dns" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.592049 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-23da-account-create-update-6mqn7" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.595614 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.602022 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-23da-account-create-update-6mqn7"] Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.653205 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2wbvt"] Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.654333 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2wbvt" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.672528 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2wbvt"] Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.688432 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9n2n\" (UniqueName: \"kubernetes.io/projected/6b603338-b25a-435a-b8a8-32b3aa1791c8-kube-api-access-v9n2n\") pod \"keystone-23da-account-create-update-6mqn7\" (UID: \"6b603338-b25a-435a-b8a8-32b3aa1791c8\") " pod="openstack/keystone-23da-account-create-update-6mqn7" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.688580 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b603338-b25a-435a-b8a8-32b3aa1791c8-operator-scripts\") pod \"keystone-23da-account-create-update-6mqn7\" (UID: \"6b603338-b25a-435a-b8a8-32b3aa1791c8\") " pod="openstack/keystone-23da-account-create-update-6mqn7" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.790671 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9n2n\" (UniqueName: \"kubernetes.io/projected/6b603338-b25a-435a-b8a8-32b3aa1791c8-kube-api-access-v9n2n\") pod \"keystone-23da-account-create-update-6mqn7\" (UID: \"6b603338-b25a-435a-b8a8-32b3aa1791c8\") " pod="openstack/keystone-23da-account-create-update-6mqn7" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.790892 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c59d51-437b-44fc-a75a-d36757cbb08a-operator-scripts\") pod \"keystone-db-create-2wbvt\" (UID: \"09c59d51-437b-44fc-a75a-d36757cbb08a\") " pod="openstack/keystone-db-create-2wbvt" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 
20:19:17.790982 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b603338-b25a-435a-b8a8-32b3aa1791c8-operator-scripts\") pod \"keystone-23da-account-create-update-6mqn7\" (UID: \"6b603338-b25a-435a-b8a8-32b3aa1791c8\") " pod="openstack/keystone-23da-account-create-update-6mqn7" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.791031 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-648s6\" (UniqueName: \"kubernetes.io/projected/09c59d51-437b-44fc-a75a-d36757cbb08a-kube-api-access-648s6\") pod \"keystone-db-create-2wbvt\" (UID: \"09c59d51-437b-44fc-a75a-d36757cbb08a\") " pod="openstack/keystone-db-create-2wbvt" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.792107 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b603338-b25a-435a-b8a8-32b3aa1791c8-operator-scripts\") pod \"keystone-23da-account-create-update-6mqn7\" (UID: \"6b603338-b25a-435a-b8a8-32b3aa1791c8\") " pod="openstack/keystone-23da-account-create-update-6mqn7" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.814914 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9n2n\" (UniqueName: \"kubernetes.io/projected/6b603338-b25a-435a-b8a8-32b3aa1791c8-kube-api-access-v9n2n\") pod \"keystone-23da-account-create-update-6mqn7\" (UID: \"6b603338-b25a-435a-b8a8-32b3aa1791c8\") " pod="openstack/keystone-23da-account-create-update-6mqn7" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.892178 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c59d51-437b-44fc-a75a-d36757cbb08a-operator-scripts\") pod \"keystone-db-create-2wbvt\" (UID: \"09c59d51-437b-44fc-a75a-d36757cbb08a\") " 
pod="openstack/keystone-db-create-2wbvt" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.892471 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-648s6\" (UniqueName: \"kubernetes.io/projected/09c59d51-437b-44fc-a75a-d36757cbb08a-kube-api-access-648s6\") pod \"keystone-db-create-2wbvt\" (UID: \"09c59d51-437b-44fc-a75a-d36757cbb08a\") " pod="openstack/keystone-db-create-2wbvt" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.893432 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c59d51-437b-44fc-a75a-d36757cbb08a-operator-scripts\") pod \"keystone-db-create-2wbvt\" (UID: \"09c59d51-437b-44fc-a75a-d36757cbb08a\") " pod="openstack/keystone-db-create-2wbvt" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.910525 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-23da-account-create-update-6mqn7" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.914027 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-648s6\" (UniqueName: \"kubernetes.io/projected/09c59d51-437b-44fc-a75a-d36757cbb08a-kube-api-access-648s6\") pod \"keystone-db-create-2wbvt\" (UID: \"09c59d51-437b-44fc-a75a-d36757cbb08a\") " pod="openstack/keystone-db-create-2wbvt" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.961260 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-x5fkh"] Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.962817 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-x5fkh" Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.970609 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-x5fkh"] Dec 02 20:19:17 crc kubenswrapper[4807]: I1202 20:19:17.971949 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2wbvt" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.027817 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.027885 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.096663 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-operator-scripts\") pod \"placement-db-create-x5fkh\" (UID: \"3a9bd526-4162-4ed4-8d18-f7dba71eec1f\") " pod="openstack/placement-db-create-x5fkh" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.096734 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtwzp\" (UniqueName: \"kubernetes.io/projected/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-kube-api-access-xtwzp\") pod \"placement-db-create-x5fkh\" (UID: \"3a9bd526-4162-4ed4-8d18-f7dba71eec1f\") " pod="openstack/placement-db-create-x5fkh" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.117276 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.174041 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3e9b-account-create-update-6hr9w"] Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.176634 
4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3e9b-account-create-update-6hr9w" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.180386 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.185187 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3e9b-account-create-update-6hr9w"] Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.198408 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-operator-scripts\") pod \"placement-db-create-x5fkh\" (UID: \"3a9bd526-4162-4ed4-8d18-f7dba71eec1f\") " pod="openstack/placement-db-create-x5fkh" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.198463 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtwzp\" (UniqueName: \"kubernetes.io/projected/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-kube-api-access-xtwzp\") pod \"placement-db-create-x5fkh\" (UID: \"3a9bd526-4162-4ed4-8d18-f7dba71eec1f\") " pod="openstack/placement-db-create-x5fkh" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.202810 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-operator-scripts\") pod \"placement-db-create-x5fkh\" (UID: \"3a9bd526-4162-4ed4-8d18-f7dba71eec1f\") " pod="openstack/placement-db-create-x5fkh" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.241487 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtwzp\" (UniqueName: \"kubernetes.io/projected/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-kube-api-access-xtwzp\") pod \"placement-db-create-x5fkh\" (UID: \"3a9bd526-4162-4ed4-8d18-f7dba71eec1f\") " 
pod="openstack/placement-db-create-x5fkh" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.283691 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x5fkh" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.300355 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkwcw\" (UniqueName: \"kubernetes.io/projected/5b4b20e6-d557-429a-b63f-8d51312805c9-kube-api-access-gkwcw\") pod \"placement-3e9b-account-create-update-6hr9w\" (UID: \"5b4b20e6-d557-429a-b63f-8d51312805c9\") " pod="openstack/placement-3e9b-account-create-update-6hr9w" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.300412 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b4b20e6-d557-429a-b63f-8d51312805c9-operator-scripts\") pod \"placement-3e9b-account-create-update-6hr9w\" (UID: \"5b4b20e6-d557-429a-b63f-8d51312805c9\") " pod="openstack/placement-3e9b-account-create-update-6hr9w" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.329468 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.406710 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkwcw\" (UniqueName: \"kubernetes.io/projected/5b4b20e6-d557-429a-b63f-8d51312805c9-kube-api-access-gkwcw\") pod \"placement-3e9b-account-create-update-6hr9w\" (UID: \"5b4b20e6-d557-429a-b63f-8d51312805c9\") " pod="openstack/placement-3e9b-account-create-update-6hr9w" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.406782 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b4b20e6-d557-429a-b63f-8d51312805c9-operator-scripts\") pod 
\"placement-3e9b-account-create-update-6hr9w\" (UID: \"5b4b20e6-d557-429a-b63f-8d51312805c9\") " pod="openstack/placement-3e9b-account-create-update-6hr9w" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.408494 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b4b20e6-d557-429a-b63f-8d51312805c9-operator-scripts\") pod \"placement-3e9b-account-create-update-6hr9w\" (UID: \"5b4b20e6-d557-429a-b63f-8d51312805c9\") " pod="openstack/placement-3e9b-account-create-update-6hr9w" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.427995 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkwcw\" (UniqueName: \"kubernetes.io/projected/5b4b20e6-d557-429a-b63f-8d51312805c9-kube-api-access-gkwcw\") pod \"placement-3e9b-account-create-update-6hr9w\" (UID: \"5b4b20e6-d557-429a-b63f-8d51312805c9\") " pod="openstack/placement-3e9b-account-create-update-6hr9w" Dec 02 20:19:18 crc kubenswrapper[4807]: I1202 20:19:18.504493 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3e9b-account-create-update-6hr9w" Dec 02 20:19:18 crc kubenswrapper[4807]: W1202 20:19:18.978388 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e632286_c5c3_48a0_a79c_445edea3b864.slice/crio-74fe31fead1f31289e2dc91fe484501301fe27c8a1cb5d5ac884a39d5eeeeed2 WatchSource:0}: Error finding container 74fe31fead1f31289e2dc91fe484501301fe27c8a1cb5d5ac884a39d5eeeeed2: Status 404 returned error can't find the container with id 74fe31fead1f31289e2dc91fe484501301fe27c8a1cb5d5ac884a39d5eeeeed2 Dec 02 20:19:19 crc kubenswrapper[4807]: I1202 20:19:19.230862 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" event={"ID":"9e632286-c5c3-48a0-a79c-445edea3b864","Type":"ContainerStarted","Data":"74fe31fead1f31289e2dc91fe484501301fe27c8a1cb5d5ac884a39d5eeeeed2"} Dec 02 20:19:19 crc kubenswrapper[4807]: I1202 20:19:19.633804 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:19 crc kubenswrapper[4807]: E1202 20:19:19.633970 4807 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 20:19:19 crc kubenswrapper[4807]: E1202 20:19:19.633987 4807 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 20:19:19 crc kubenswrapper[4807]: E1202 20:19:19.634045 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift podName:caaccc0b-6743-4907-9d87-f4ab26c931e2 nodeName:}" failed. 
No retries permitted until 2025-12-02 20:19:27.634026897 +0000 UTC m=+1302.934934392 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift") pod "swift-storage-0" (UID: "caaccc0b-6743-4907-9d87-f4ab26c931e2") : configmap "swift-ring-files" not found Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.171413 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-6dkkn"] Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.173264 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-6dkkn" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.181619 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-6dkkn"] Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.245391 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx442\" (UniqueName: \"kubernetes.io/projected/08b18fa4-ea02-46c3-86ea-10e33edde0c0-kube-api-access-cx442\") pod \"watcher-db-create-6dkkn\" (UID: \"08b18fa4-ea02-46c3-86ea-10e33edde0c0\") " pod="openstack/watcher-db-create-6dkkn" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.245489 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b18fa4-ea02-46c3-86ea-10e33edde0c0-operator-scripts\") pod \"watcher-db-create-6dkkn\" (UID: \"08b18fa4-ea02-46c3-86ea-10e33edde0c0\") " pod="openstack/watcher-db-create-6dkkn" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.287558 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-46a0-account-create-update-6nrzt"] Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.289480 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-46a0-account-create-update-6nrzt" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.295968 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.307682 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-46a0-account-create-update-6nrzt"] Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.346643 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx442\" (UniqueName: \"kubernetes.io/projected/08b18fa4-ea02-46c3-86ea-10e33edde0c0-kube-api-access-cx442\") pod \"watcher-db-create-6dkkn\" (UID: \"08b18fa4-ea02-46c3-86ea-10e33edde0c0\") " pod="openstack/watcher-db-create-6dkkn" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.346738 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b18fa4-ea02-46c3-86ea-10e33edde0c0-operator-scripts\") pod \"watcher-db-create-6dkkn\" (UID: \"08b18fa4-ea02-46c3-86ea-10e33edde0c0\") " pod="openstack/watcher-db-create-6dkkn" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.347482 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b18fa4-ea02-46c3-86ea-10e33edde0c0-operator-scripts\") pod \"watcher-db-create-6dkkn\" (UID: \"08b18fa4-ea02-46c3-86ea-10e33edde0c0\") " pod="openstack/watcher-db-create-6dkkn" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.375368 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx442\" (UniqueName: \"kubernetes.io/projected/08b18fa4-ea02-46c3-86ea-10e33edde0c0-kube-api-access-cx442\") pod \"watcher-db-create-6dkkn\" (UID: \"08b18fa4-ea02-46c3-86ea-10e33edde0c0\") " pod="openstack/watcher-db-create-6dkkn" Dec 02 20:19:20 crc kubenswrapper[4807]: 
I1202 20:19:20.448093 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/390cf7e1-da04-4176-aa78-71446bc7cef4-operator-scripts\") pod \"watcher-46a0-account-create-update-6nrzt\" (UID: \"390cf7e1-da04-4176-aa78-71446bc7cef4\") " pod="openstack/watcher-46a0-account-create-update-6nrzt" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.448190 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh7pv\" (UniqueName: \"kubernetes.io/projected/390cf7e1-da04-4176-aa78-71446bc7cef4-kube-api-access-rh7pv\") pod \"watcher-46a0-account-create-update-6nrzt\" (UID: \"390cf7e1-da04-4176-aa78-71446bc7cef4\") " pod="openstack/watcher-46a0-account-create-update-6nrzt" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.493138 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-6dkkn" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.549797 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh7pv\" (UniqueName: \"kubernetes.io/projected/390cf7e1-da04-4176-aa78-71446bc7cef4-kube-api-access-rh7pv\") pod \"watcher-46a0-account-create-update-6nrzt\" (UID: \"390cf7e1-da04-4176-aa78-71446bc7cef4\") " pod="openstack/watcher-46a0-account-create-update-6nrzt" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.550456 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/390cf7e1-da04-4176-aa78-71446bc7cef4-operator-scripts\") pod \"watcher-46a0-account-create-update-6nrzt\" (UID: \"390cf7e1-da04-4176-aa78-71446bc7cef4\") " pod="openstack/watcher-46a0-account-create-update-6nrzt" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.551562 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/390cf7e1-da04-4176-aa78-71446bc7cef4-operator-scripts\") pod \"watcher-46a0-account-create-update-6nrzt\" (UID: \"390cf7e1-da04-4176-aa78-71446bc7cef4\") " pod="openstack/watcher-46a0-account-create-update-6nrzt" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.570800 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh7pv\" (UniqueName: \"kubernetes.io/projected/390cf7e1-da04-4176-aa78-71446bc7cef4-kube-api-access-rh7pv\") pod \"watcher-46a0-account-create-update-6nrzt\" (UID: \"390cf7e1-da04-4176-aa78-71446bc7cef4\") " pod="openstack/watcher-46a0-account-create-update-6nrzt" Dec 02 20:19:20 crc kubenswrapper[4807]: I1202 20:19:20.617710 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-46a0-account-create-update-6nrzt" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.346371 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2vb9g"] Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.347858 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2vb9g" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.367449 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2vb9g"] Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.403292 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wprqw\" (UniqueName: \"kubernetes.io/projected/d8927ec0-1565-476c-9625-462a7d198c4e-kube-api-access-wprqw\") pod \"glance-db-create-2vb9g\" (UID: \"d8927ec0-1565-476c-9625-462a7d198c4e\") " pod="openstack/glance-db-create-2vb9g" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.403445 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8927ec0-1565-476c-9625-462a7d198c4e-operator-scripts\") pod \"glance-db-create-2vb9g\" (UID: \"d8927ec0-1565-476c-9625-462a7d198c4e\") " pod="openstack/glance-db-create-2vb9g" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.454667 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-54fc-account-create-update-c9zhp"] Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.455893 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-54fc-account-create-update-c9zhp" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.458348 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.466165 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-54fc-account-create-update-c9zhp"] Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.505536 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8927ec0-1565-476c-9625-462a7d198c4e-operator-scripts\") pod \"glance-db-create-2vb9g\" (UID: \"d8927ec0-1565-476c-9625-462a7d198c4e\") " pod="openstack/glance-db-create-2vb9g" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.505679 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wprqw\" (UniqueName: \"kubernetes.io/projected/d8927ec0-1565-476c-9625-462a7d198c4e-kube-api-access-wprqw\") pod \"glance-db-create-2vb9g\" (UID: \"d8927ec0-1565-476c-9625-462a7d198c4e\") " pod="openstack/glance-db-create-2vb9g" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.506611 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8927ec0-1565-476c-9625-462a7d198c4e-operator-scripts\") pod \"glance-db-create-2vb9g\" (UID: \"d8927ec0-1565-476c-9625-462a7d198c4e\") " pod="openstack/glance-db-create-2vb9g" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.524992 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wprqw\" (UniqueName: \"kubernetes.io/projected/d8927ec0-1565-476c-9625-462a7d198c4e-kube-api-access-wprqw\") pod \"glance-db-create-2vb9g\" (UID: \"d8927ec0-1565-476c-9625-462a7d198c4e\") " pod="openstack/glance-db-create-2vb9g" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 
20:19:23.608008 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30197c83-6070-4ace-b56a-cf82f143ffb5-operator-scripts\") pod \"glance-54fc-account-create-update-c9zhp\" (UID: \"30197c83-6070-4ace-b56a-cf82f143ffb5\") " pod="openstack/glance-54fc-account-create-update-c9zhp" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.608712 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjndd\" (UniqueName: \"kubernetes.io/projected/30197c83-6070-4ace-b56a-cf82f143ffb5-kube-api-access-hjndd\") pod \"glance-54fc-account-create-update-c9zhp\" (UID: \"30197c83-6070-4ace-b56a-cf82f143ffb5\") " pod="openstack/glance-54fc-account-create-update-c9zhp" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.627012 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-d5xdx" podUID="ed377bf7-d0c1-45d0-bad2-948f4bde39aa" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.668565 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2vb9g" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.711617 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30197c83-6070-4ace-b56a-cf82f143ffb5-operator-scripts\") pod \"glance-54fc-account-create-update-c9zhp\" (UID: \"30197c83-6070-4ace-b56a-cf82f143ffb5\") " pod="openstack/glance-54fc-account-create-update-c9zhp" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.711859 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjndd\" (UniqueName: \"kubernetes.io/projected/30197c83-6070-4ace-b56a-cf82f143ffb5-kube-api-access-hjndd\") pod \"glance-54fc-account-create-update-c9zhp\" (UID: \"30197c83-6070-4ace-b56a-cf82f143ffb5\") " pod="openstack/glance-54fc-account-create-update-c9zhp" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.714211 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30197c83-6070-4ace-b56a-cf82f143ffb5-operator-scripts\") pod \"glance-54fc-account-create-update-c9zhp\" (UID: \"30197c83-6070-4ace-b56a-cf82f143ffb5\") " pod="openstack/glance-54fc-account-create-update-c9zhp" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.747812 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjndd\" (UniqueName: \"kubernetes.io/projected/30197c83-6070-4ace-b56a-cf82f143ffb5-kube-api-access-hjndd\") pod \"glance-54fc-account-create-update-c9zhp\" (UID: \"30197c83-6070-4ace-b56a-cf82f143ffb5\") " pod="openstack/glance-54fc-account-create-update-c9zhp" Dec 02 20:19:23 crc kubenswrapper[4807]: I1202 20:19:23.776616 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-54fc-account-create-update-c9zhp" Dec 02 20:19:25 crc kubenswrapper[4807]: E1202 20:19:25.262603 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef1b0038_6b67_47e8_92e8_06efe88df856.slice/crio-conmon-7f285fca8e0cd1cf35353ce537c1073826621a4833a365e1187dd2c1899b4466.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:19:25 crc kubenswrapper[4807]: I1202 20:19:25.301144 4807 generic.go:334] "Generic (PLEG): container finished" podID="ef1b0038-6b67-47e8-92e8-06efe88df856" containerID="7f285fca8e0cd1cf35353ce537c1073826621a4833a365e1187dd2c1899b4466" exitCode=0 Dec 02 20:19:25 crc kubenswrapper[4807]: I1202 20:19:25.301215 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1b0038-6b67-47e8-92e8-06efe88df856","Type":"ContainerDied","Data":"7f285fca8e0cd1cf35353ce537c1073826621a4833a365e1187dd2c1899b4466"} Dec 02 20:19:25 crc kubenswrapper[4807]: I1202 20:19:25.303046 4807 generic.go:334] "Generic (PLEG): container finished" podID="2af53e2b-bee9-4bd4-aab6-e58d33057ac7" containerID="3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb" exitCode=0 Dec 02 20:19:25 crc kubenswrapper[4807]: I1202 20:19:25.303088 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2af53e2b-bee9-4bd4-aab6-e58d33057ac7","Type":"ContainerDied","Data":"3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb"} Dec 02 20:19:25 crc kubenswrapper[4807]: I1202 20:19:25.818932 4807 scope.go:117] "RemoveContainer" containerID="bc5db0d31429d1be5c9b239c4c336fb8e27d6756a85325dac8fde8af9bc0c300" Dec 02 20:19:26 crc kubenswrapper[4807]: I1202 20:19:26.242881 4807 scope.go:117] "RemoveContainer" containerID="26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba" Dec 02 20:19:26 crc 
kubenswrapper[4807]: E1202 20:19:26.243832 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba\": container with ID starting with 26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba not found: ID does not exist" containerID="26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba" Dec 02 20:19:26 crc kubenswrapper[4807]: I1202 20:19:26.243886 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba"} err="failed to get container status \"26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba\": rpc error: code = NotFound desc = could not find container \"26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba\": container with ID starting with 26fd402d7279cd560b92e936f5ce08e9d54081c5dc88f24e5e423127e3bfa8ba not found: ID does not exist" Dec 02 20:19:26 crc kubenswrapper[4807]: I1202 20:19:26.243922 4807 scope.go:117] "RemoveContainer" containerID="bc5db0d31429d1be5c9b239c4c336fb8e27d6756a85325dac8fde8af9bc0c300" Dec 02 20:19:26 crc kubenswrapper[4807]: E1202 20:19:26.244285 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc5db0d31429d1be5c9b239c4c336fb8e27d6756a85325dac8fde8af9bc0c300\": container with ID starting with bc5db0d31429d1be5c9b239c4c336fb8e27d6756a85325dac8fde8af9bc0c300 not found: ID does not exist" containerID="bc5db0d31429d1be5c9b239c4c336fb8e27d6756a85325dac8fde8af9bc0c300" Dec 02 20:19:26 crc kubenswrapper[4807]: I1202 20:19:26.244317 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc5db0d31429d1be5c9b239c4c336fb8e27d6756a85325dac8fde8af9bc0c300"} err="failed to get container status 
\"bc5db0d31429d1be5c9b239c4c336fb8e27d6756a85325dac8fde8af9bc0c300\": rpc error: code = NotFound desc = could not find container \"bc5db0d31429d1be5c9b239c4c336fb8e27d6756a85325dac8fde8af9bc0c300\": container with ID starting with bc5db0d31429d1be5c9b239c4c336fb8e27d6756a85325dac8fde8af9bc0c300 not found: ID does not exist" Dec 02 20:19:26 crc kubenswrapper[4807]: I1202 20:19:26.295970 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3e9b-account-create-update-6hr9w"] Dec 02 20:19:26 crc kubenswrapper[4807]: W1202 20:19:26.383696 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b4b20e6_d557_429a_b63f_8d51312805c9.slice/crio-2f4907c5a2d61cfefff14185ca7e96745f0a8f25d5a9c57ccdc701884a4f0d36 WatchSource:0}: Error finding container 2f4907c5a2d61cfefff14185ca7e96745f0a8f25d5a9c57ccdc701884a4f0d36: Status 404 returned error can't find the container with id 2f4907c5a2d61cfefff14185ca7e96745f0a8f25d5a9c57ccdc701884a4f0d36 Dec 02 20:19:26 crc kubenswrapper[4807]: I1202 20:19:26.456500 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2wbvt"] Dec 02 20:19:26 crc kubenswrapper[4807]: I1202 20:19:26.686992 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-x5fkh"] Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.008607 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-46a0-account-create-update-6nrzt"] Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.008937 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-23da-account-create-update-6mqn7"] Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.012130 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-54fc-account-create-update-c9zhp"] Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.123199 4807 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/glance-db-create-2vb9g"] Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.154309 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-6dkkn"] Dec 02 20:19:27 crc kubenswrapper[4807]: W1202 20:19:27.161664 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8927ec0_1565_476c_9625_462a7d198c4e.slice/crio-e49764c46213b68e310f1d227109f39e911df28c714ea425f01c3b8a29b16a1d WatchSource:0}: Error finding container e49764c46213b68e310f1d227109f39e911df28c714ea425f01c3b8a29b16a1d: Status 404 returned error can't find the container with id e49764c46213b68e310f1d227109f39e911df28c714ea425f01c3b8a29b16a1d Dec 02 20:19:27 crc kubenswrapper[4807]: W1202 20:19:27.210026 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08b18fa4_ea02_46c3_86ea_10e33edde0c0.slice/crio-bba1b06808659b26e0d1aeceb77e7461f77abba77e624a7caca04582ce5a308f WatchSource:0}: Error finding container bba1b06808659b26e0d1aeceb77e7461f77abba77e624a7caca04582ce5a308f: Status 404 returned error can't find the container with id bba1b06808659b26e0d1aeceb77e7461f77abba77e624a7caca04582ce5a308f Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.330706 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x5fkh" event={"ID":"3a9bd526-4162-4ed4-8d18-f7dba71eec1f","Type":"ContainerStarted","Data":"6a9a1d5b0ac3fc0e884dc4c2abb9d78ad227bf951512f56dae0ced18490accb9"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.332395 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2vb9g" event={"ID":"d8927ec0-1565-476c-9625-462a7d198c4e","Type":"ContainerStarted","Data":"e49764c46213b68e310f1d227109f39e911df28c714ea425f01c3b8a29b16a1d"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.344623 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-msbf5" event={"ID":"a3dd1529-ab40-486b-8458-e3d1afc9a0e2","Type":"ContainerStarted","Data":"71f6b872d77b94a0995c498bad5b0c3ad377f9739102d6f20bee41f32571bd52"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.349290 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2wbvt" event={"ID":"09c59d51-437b-44fc-a75a-d36757cbb08a","Type":"ContainerStarted","Data":"fe914b422c537b12719dcbad005a5ceb635a1872eec8403f0dc60255397c21b4"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.358795 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2af53e2b-bee9-4bd4-aab6-e58d33057ac7","Type":"ContainerStarted","Data":"c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.359047 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.361485 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-54fc-account-create-update-c9zhp" event={"ID":"30197c83-6070-4ace-b56a-cf82f143ffb5","Type":"ContainerStarted","Data":"e3e1387d1c93ddf31bce91d338a8fcce635b2dd85b6670f8fcf6dba974fbb080"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.364157 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89cb183e-cc2d-4fd9-90d8-212c434f0d06","Type":"ContainerStarted","Data":"ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.375040 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1b0038-6b67-47e8-92e8-06efe88df856","Type":"ContainerStarted","Data":"0af067dc0eb81760a184db00d47c1e1ecb454638303485d36be2f0bf5370e0f8"} Dec 02 20:19:27 
crc kubenswrapper[4807]: I1202 20:19:27.375555 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.377117 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-msbf5" podStartSLOduration=2.315058218 podStartE2EDuration="15.377097759s" podCreationTimestamp="2025-12-02 20:19:12 +0000 UTC" firstStartedPulling="2025-12-02 20:19:13.12108972 +0000 UTC m=+1288.421997215" lastFinishedPulling="2025-12-02 20:19:26.183129271 +0000 UTC m=+1301.484036756" observedRunningTime="2025-12-02 20:19:27.364588996 +0000 UTC m=+1302.665496511" watchObservedRunningTime="2025-12-02 20:19:27.377097759 +0000 UTC m=+1302.678005284" Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.381498 4807 generic.go:334] "Generic (PLEG): container finished" podID="9e632286-c5c3-48a0-a79c-445edea3b864" containerID="745648b60cf2543ae1e6dffbb9bcdb776ae860287454a955f890a3604dca454d" exitCode=0 Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.381580 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" event={"ID":"9e632286-c5c3-48a0-a79c-445edea3b864","Type":"ContainerDied","Data":"745648b60cf2543ae1e6dffbb9bcdb776ae860287454a955f890a3604dca454d"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.387060 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-6dkkn" event={"ID":"08b18fa4-ea02-46c3-86ea-10e33edde0c0","Type":"ContainerStarted","Data":"bba1b06808659b26e0d1aeceb77e7461f77abba77e624a7caca04582ce5a308f"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.389233 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-23da-account-create-update-6mqn7" event={"ID":"6b603338-b25a-435a-b8a8-32b3aa1791c8","Type":"ContainerStarted","Data":"8b74226e2c406fe3b0c62a31daf16045e251bd247eb3ea39ea622e8e01374373"} Dec 02 20:19:27 crc 
kubenswrapper[4807]: I1202 20:19:27.397450 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-46a0-account-create-update-6nrzt" event={"ID":"390cf7e1-da04-4176-aa78-71446bc7cef4","Type":"ContainerStarted","Data":"29db8e6ab8005bc69fc3e770c5a5bb4801671e7fa4f2a328938876692b8c851e"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.398985 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.20271768 podStartE2EDuration="1m4.39896403s" podCreationTimestamp="2025-12-02 20:18:23 +0000 UTC" firstStartedPulling="2025-12-02 20:18:38.493876825 +0000 UTC m=+1253.794784320" lastFinishedPulling="2025-12-02 20:18:49.690123175 +0000 UTC m=+1264.991030670" observedRunningTime="2025-12-02 20:19:27.393810176 +0000 UTC m=+1302.694717681" watchObservedRunningTime="2025-12-02 20:19:27.39896403 +0000 UTC m=+1302.699871525" Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.401229 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"06539675-3505-4f57-bdfd-54ccdb96d90a","Type":"ContainerStarted","Data":"91279b73ea6115370cc9247a2ce50d02bf35e1703cef3756e9b911b8a062c68b"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.404178 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" event={"ID":"688a394e-b6ac-45a4-8617-1b59bcb1d7b6","Type":"ContainerStarted","Data":"d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.404460 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.405406 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3e9b-account-create-update-6hr9w" 
event={"ID":"5b4b20e6-d557-429a-b63f-8d51312805c9","Type":"ContainerStarted","Data":"2f4907c5a2d61cfefff14185ca7e96745f0a8f25d5a9c57ccdc701884a4f0d36"} Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.479111 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" podStartSLOduration=13.479091567 podStartE2EDuration="13.479091567s" podCreationTimestamp="2025-12-02 20:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:19:27.473022636 +0000 UTC m=+1302.773930131" watchObservedRunningTime="2025-12-02 20:19:27.479091567 +0000 UTC m=+1302.779999062" Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.486198 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.968255725 podStartE2EDuration="1m4.486179058s" podCreationTimestamp="2025-12-02 20:18:23 +0000 UTC" firstStartedPulling="2025-12-02 20:18:39.207085781 +0000 UTC m=+1254.507993276" lastFinishedPulling="2025-12-02 20:18:49.725009114 +0000 UTC m=+1265.025916609" observedRunningTime="2025-12-02 20:19:27.456001069 +0000 UTC m=+1302.756908564" watchObservedRunningTime="2025-12-02 20:19:27.486179058 +0000 UTC m=+1302.787086553" Dec 02 20:19:27 crc kubenswrapper[4807]: I1202 20:19:27.717541 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:27 crc kubenswrapper[4807]: E1202 20:19:27.717739 4807 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 20:19:27 crc kubenswrapper[4807]: E1202 20:19:27.717775 4807 projected.go:194] Error preparing data for projected volume 
etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 20:19:27 crc kubenswrapper[4807]: E1202 20:19:27.717819 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift podName:caaccc0b-6743-4907-9d87-f4ab26c931e2 nodeName:}" failed. No retries permitted until 2025-12-02 20:19:43.717800588 +0000 UTC m=+1319.018708083 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift") pod "swift-storage-0" (UID: "caaccc0b-6743-4907-9d87-f4ab26c931e2") : configmap "swift-ring-files" not found Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.418027 4807 generic.go:334] "Generic (PLEG): container finished" podID="3a9bd526-4162-4ed4-8d18-f7dba71eec1f" containerID="bbe035a07e060203525e8c15282e42ccc7c1178199d1ef52e3799ef95042bc15" exitCode=0 Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.418155 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x5fkh" event={"ID":"3a9bd526-4162-4ed4-8d18-f7dba71eec1f","Type":"ContainerDied","Data":"bbe035a07e060203525e8c15282e42ccc7c1178199d1ef52e3799ef95042bc15"} Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.420766 4807 generic.go:334] "Generic (PLEG): container finished" podID="d8927ec0-1565-476c-9625-462a7d198c4e" containerID="2abdfb1d93c9081581c4ffc5d4699330cf0a2655b01fc364421929d2ead46544" exitCode=0 Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.420916 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2vb9g" event={"ID":"d8927ec0-1565-476c-9625-462a7d198c4e","Type":"ContainerDied","Data":"2abdfb1d93c9081581c4ffc5d4699330cf0a2655b01fc364421929d2ead46544"} Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.425163 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="390cf7e1-da04-4176-aa78-71446bc7cef4" containerID="84a679e22fe45118cdd97bc864b3daf34e2764b9ce124d28838391ed9d8b13ef" exitCode=0 Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.425248 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-46a0-account-create-update-6nrzt" event={"ID":"390cf7e1-da04-4176-aa78-71446bc7cef4","Type":"ContainerDied","Data":"84a679e22fe45118cdd97bc864b3daf34e2764b9ce124d28838391ed9d8b13ef"} Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.433516 4807 generic.go:334] "Generic (PLEG): container finished" podID="09c59d51-437b-44fc-a75a-d36757cbb08a" containerID="e9630964cbde1ca10c955054abcc52a293ff06f9e399ab1398e81fa4c7d7c762" exitCode=0 Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.433603 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2wbvt" event={"ID":"09c59d51-437b-44fc-a75a-d36757cbb08a","Type":"ContainerDied","Data":"e9630964cbde1ca10c955054abcc52a293ff06f9e399ab1398e81fa4c7d7c762"} Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.438371 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"06539675-3505-4f57-bdfd-54ccdb96d90a","Type":"ContainerStarted","Data":"196ba313972be40e0dd11d5214bd0d63565309deff4e05241cde55c962268a3f"} Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.438547 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.441615 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kstxp" podUID="80844aa2-667c-4b2d-a55a-e5fa2cd3dd85" containerName="ovn-controller" probeResult="failure" output=< Dec 02 20:19:28 crc kubenswrapper[4807]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 02 20:19:28 crc kubenswrapper[4807]: > Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.443967 4807 
generic.go:334] "Generic (PLEG): container finished" podID="08b18fa4-ea02-46c3-86ea-10e33edde0c0" containerID="0c17d948283b8bb8e0d783d58af0728007697c56538bac87a741940c6a06d0b2" exitCode=0 Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.444027 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-6dkkn" event={"ID":"08b18fa4-ea02-46c3-86ea-10e33edde0c0","Type":"ContainerDied","Data":"0c17d948283b8bb8e0d783d58af0728007697c56538bac87a741940c6a06d0b2"} Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.445574 4807 generic.go:334] "Generic (PLEG): container finished" podID="6b603338-b25a-435a-b8a8-32b3aa1791c8" containerID="09b26364b38c93c98ab4dd264eac9bd59c3d1343ee873cda958c5e64c52890aa" exitCode=0 Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.445624 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-23da-account-create-update-6mqn7" event={"ID":"6b603338-b25a-435a-b8a8-32b3aa1791c8","Type":"ContainerDied","Data":"09b26364b38c93c98ab4dd264eac9bd59c3d1343ee873cda958c5e64c52890aa"} Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.448793 4807 generic.go:334] "Generic (PLEG): container finished" podID="30197c83-6070-4ace-b56a-cf82f143ffb5" containerID="651f490f87cfb537cb2f9e57d84f6548ef70d4778258b10a7e7998c5b7852171" exitCode=0 Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.448852 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-54fc-account-create-update-c9zhp" event={"ID":"30197c83-6070-4ace-b56a-cf82f143ffb5","Type":"ContainerDied","Data":"651f490f87cfb537cb2f9e57d84f6548ef70d4778258b10a7e7998c5b7852171"} Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.452595 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" event={"ID":"9e632286-c5c3-48a0-a79c-445edea3b864","Type":"ContainerStarted","Data":"66603f698908783c8443c16b6ab879e122ba6a4191f63cfc19f612f4335ef295"} Dec 02 20:19:28 crc 
kubenswrapper[4807]: I1202 20:19:28.452705 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.460598 4807 generic.go:334] "Generic (PLEG): container finished" podID="5b4b20e6-d557-429a-b63f-8d51312805c9" containerID="d2557b6d0d009c3c22817ec1e240888b2382e6beb35d83125eb7aad25b7daa5e" exitCode=0 Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.461223 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3e9b-account-create-update-6hr9w" event={"ID":"5b4b20e6-d557-429a-b63f-8d51312805c9","Type":"ContainerDied","Data":"d2557b6d0d009c3c22817ec1e240888b2382e6beb35d83125eb7aad25b7daa5e"} Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.514680 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" podStartSLOduration=14.514659806000001 podStartE2EDuration="14.514659806s" podCreationTimestamp="2025-12-02 20:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:19:28.511380629 +0000 UTC m=+1303.812288144" watchObservedRunningTime="2025-12-02 20:19:28.514659806 +0000 UTC m=+1303.815567291" Dec 02 20:19:28 crc kubenswrapper[4807]: I1202 20:19:28.592145 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.152878394 podStartE2EDuration="14.592130064s" podCreationTimestamp="2025-12-02 20:19:14 +0000 UTC" firstStartedPulling="2025-12-02 20:19:15.827631056 +0000 UTC m=+1291.128538551" lastFinishedPulling="2025-12-02 20:19:26.266882726 +0000 UTC m=+1301.567790221" observedRunningTime="2025-12-02 20:19:28.589933169 +0000 UTC m=+1303.890840664" watchObservedRunningTime="2025-12-02 20:19:28.592130064 +0000 UTC m=+1303.893037559" Dec 02 20:19:29 crc kubenswrapper[4807]: I1202 20:19:29.864759 4807 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2wbvt" Dec 02 20:19:29 crc kubenswrapper[4807]: I1202 20:19:29.921537 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-648s6\" (UniqueName: \"kubernetes.io/projected/09c59d51-437b-44fc-a75a-d36757cbb08a-kube-api-access-648s6\") pod \"09c59d51-437b-44fc-a75a-d36757cbb08a\" (UID: \"09c59d51-437b-44fc-a75a-d36757cbb08a\") " Dec 02 20:19:29 crc kubenswrapper[4807]: I1202 20:19:29.921653 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c59d51-437b-44fc-a75a-d36757cbb08a-operator-scripts\") pod \"09c59d51-437b-44fc-a75a-d36757cbb08a\" (UID: \"09c59d51-437b-44fc-a75a-d36757cbb08a\") " Dec 02 20:19:29 crc kubenswrapper[4807]: I1202 20:19:29.923409 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09c59d51-437b-44fc-a75a-d36757cbb08a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09c59d51-437b-44fc-a75a-d36757cbb08a" (UID: "09c59d51-437b-44fc-a75a-d36757cbb08a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:29 crc kubenswrapper[4807]: I1202 20:19:29.941701 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09c59d51-437b-44fc-a75a-d36757cbb08a-kube-api-access-648s6" (OuterVolumeSpecName: "kube-api-access-648s6") pod "09c59d51-437b-44fc-a75a-d36757cbb08a" (UID: "09c59d51-437b-44fc-a75a-d36757cbb08a"). InnerVolumeSpecName "kube-api-access-648s6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.024255 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-648s6\" (UniqueName: \"kubernetes.io/projected/09c59d51-437b-44fc-a75a-d36757cbb08a-kube-api-access-648s6\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.024297 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09c59d51-437b-44fc-a75a-d36757cbb08a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.158894 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-54fc-account-create-update-c9zhp" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.168817 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3e9b-account-create-update-6hr9w" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.204281 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2vb9g" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.212617 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-23da-account-create-update-6mqn7" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.218096 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-46a0-account-create-update-6nrzt" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.228559 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-6dkkn" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.228968 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b4b20e6-d557-429a-b63f-8d51312805c9-operator-scripts\") pod \"5b4b20e6-d557-429a-b63f-8d51312805c9\" (UID: \"5b4b20e6-d557-429a-b63f-8d51312805c9\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.229015 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9n2n\" (UniqueName: \"kubernetes.io/projected/6b603338-b25a-435a-b8a8-32b3aa1791c8-kube-api-access-v9n2n\") pod \"6b603338-b25a-435a-b8a8-32b3aa1791c8\" (UID: \"6b603338-b25a-435a-b8a8-32b3aa1791c8\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.229040 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjndd\" (UniqueName: \"kubernetes.io/projected/30197c83-6070-4ace-b56a-cf82f143ffb5-kube-api-access-hjndd\") pod \"30197c83-6070-4ace-b56a-cf82f143ffb5\" (UID: \"30197c83-6070-4ace-b56a-cf82f143ffb5\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.229071 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wprqw\" (UniqueName: \"kubernetes.io/projected/d8927ec0-1565-476c-9625-462a7d198c4e-kube-api-access-wprqw\") pod \"d8927ec0-1565-476c-9625-462a7d198c4e\" (UID: \"d8927ec0-1565-476c-9625-462a7d198c4e\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.229097 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/390cf7e1-da04-4176-aa78-71446bc7cef4-operator-scripts\") pod \"390cf7e1-da04-4176-aa78-71446bc7cef4\" (UID: \"390cf7e1-da04-4176-aa78-71446bc7cef4\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.229127 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-rh7pv\" (UniqueName: \"kubernetes.io/projected/390cf7e1-da04-4176-aa78-71446bc7cef4-kube-api-access-rh7pv\") pod \"390cf7e1-da04-4176-aa78-71446bc7cef4\" (UID: \"390cf7e1-da04-4176-aa78-71446bc7cef4\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.229162 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkwcw\" (UniqueName: \"kubernetes.io/projected/5b4b20e6-d557-429a-b63f-8d51312805c9-kube-api-access-gkwcw\") pod \"5b4b20e6-d557-429a-b63f-8d51312805c9\" (UID: \"5b4b20e6-d557-429a-b63f-8d51312805c9\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.229497 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b4b20e6-d557-429a-b63f-8d51312805c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b4b20e6-d557-429a-b63f-8d51312805c9" (UID: "5b4b20e6-d557-429a-b63f-8d51312805c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.230134 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/390cf7e1-da04-4176-aa78-71446bc7cef4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "390cf7e1-da04-4176-aa78-71446bc7cef4" (UID: "390cf7e1-da04-4176-aa78-71446bc7cef4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.234059 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b4b20e6-d557-429a-b63f-8d51312805c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.234149 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/390cf7e1-da04-4176-aa78-71446bc7cef4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.236335 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8927ec0-1565-476c-9625-462a7d198c4e-kube-api-access-wprqw" (OuterVolumeSpecName: "kube-api-access-wprqw") pod "d8927ec0-1565-476c-9625-462a7d198c4e" (UID: "d8927ec0-1565-476c-9625-462a7d198c4e"). InnerVolumeSpecName "kube-api-access-wprqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.236366 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b4b20e6-d557-429a-b63f-8d51312805c9-kube-api-access-gkwcw" (OuterVolumeSpecName: "kube-api-access-gkwcw") pod "5b4b20e6-d557-429a-b63f-8d51312805c9" (UID: "5b4b20e6-d557-429a-b63f-8d51312805c9"). InnerVolumeSpecName "kube-api-access-gkwcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.236384 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b603338-b25a-435a-b8a8-32b3aa1791c8-kube-api-access-v9n2n" (OuterVolumeSpecName: "kube-api-access-v9n2n") pod "6b603338-b25a-435a-b8a8-32b3aa1791c8" (UID: "6b603338-b25a-435a-b8a8-32b3aa1791c8"). InnerVolumeSpecName "kube-api-access-v9n2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.236447 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/390cf7e1-da04-4176-aa78-71446bc7cef4-kube-api-access-rh7pv" (OuterVolumeSpecName: "kube-api-access-rh7pv") pod "390cf7e1-da04-4176-aa78-71446bc7cef4" (UID: "390cf7e1-da04-4176-aa78-71446bc7cef4"). InnerVolumeSpecName "kube-api-access-rh7pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.242981 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30197c83-6070-4ace-b56a-cf82f143ffb5-kube-api-access-hjndd" (OuterVolumeSpecName: "kube-api-access-hjndd") pod "30197c83-6070-4ace-b56a-cf82f143ffb5" (UID: "30197c83-6070-4ace-b56a-cf82f143ffb5"). InnerVolumeSpecName "kube-api-access-hjndd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.262547 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-x5fkh" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.334985 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx442\" (UniqueName: \"kubernetes.io/projected/08b18fa4-ea02-46c3-86ea-10e33edde0c0-kube-api-access-cx442\") pod \"08b18fa4-ea02-46c3-86ea-10e33edde0c0\" (UID: \"08b18fa4-ea02-46c3-86ea-10e33edde0c0\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.335034 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtwzp\" (UniqueName: \"kubernetes.io/projected/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-kube-api-access-xtwzp\") pod \"3a9bd526-4162-4ed4-8d18-f7dba71eec1f\" (UID: \"3a9bd526-4162-4ed4-8d18-f7dba71eec1f\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.335096 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b603338-b25a-435a-b8a8-32b3aa1791c8-operator-scripts\") pod \"6b603338-b25a-435a-b8a8-32b3aa1791c8\" (UID: \"6b603338-b25a-435a-b8a8-32b3aa1791c8\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.335154 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8927ec0-1565-476c-9625-462a7d198c4e-operator-scripts\") pod \"d8927ec0-1565-476c-9625-462a7d198c4e\" (UID: \"d8927ec0-1565-476c-9625-462a7d198c4e\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.335174 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b18fa4-ea02-46c3-86ea-10e33edde0c0-operator-scripts\") pod \"08b18fa4-ea02-46c3-86ea-10e33edde0c0\" (UID: \"08b18fa4-ea02-46c3-86ea-10e33edde0c0\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.335195 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30197c83-6070-4ace-b56a-cf82f143ffb5-operator-scripts\") pod \"30197c83-6070-4ace-b56a-cf82f143ffb5\" (UID: \"30197c83-6070-4ace-b56a-cf82f143ffb5\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.335313 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-operator-scripts\") pod \"3a9bd526-4162-4ed4-8d18-f7dba71eec1f\" (UID: \"3a9bd526-4162-4ed4-8d18-f7dba71eec1f\") " Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.335986 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3a9bd526-4162-4ed4-8d18-f7dba71eec1f" (UID: "3a9bd526-4162-4ed4-8d18-f7dba71eec1f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.336659 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wprqw\" (UniqueName: \"kubernetes.io/projected/d8927ec0-1565-476c-9625-462a7d198c4e-kube-api-access-wprqw\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.336678 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh7pv\" (UniqueName: \"kubernetes.io/projected/390cf7e1-da04-4176-aa78-71446bc7cef4-kube-api-access-rh7pv\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.336689 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkwcw\" (UniqueName: \"kubernetes.io/projected/5b4b20e6-d557-429a-b63f-8d51312805c9-kube-api-access-gkwcw\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.336733 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9n2n\" (UniqueName: \"kubernetes.io/projected/6b603338-b25a-435a-b8a8-32b3aa1791c8-kube-api-access-v9n2n\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.336745 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjndd\" (UniqueName: \"kubernetes.io/projected/30197c83-6070-4ace-b56a-cf82f143ffb5-kube-api-access-hjndd\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.337103 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8927ec0-1565-476c-9625-462a7d198c4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8927ec0-1565-476c-9625-462a7d198c4e" (UID: "d8927ec0-1565-476c-9625-462a7d198c4e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.338428 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30197c83-6070-4ace-b56a-cf82f143ffb5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30197c83-6070-4ace-b56a-cf82f143ffb5" (UID: "30197c83-6070-4ace-b56a-cf82f143ffb5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.338470 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b603338-b25a-435a-b8a8-32b3aa1791c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b603338-b25a-435a-b8a8-32b3aa1791c8" (UID: "6b603338-b25a-435a-b8a8-32b3aa1791c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.338519 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08b18fa4-ea02-46c3-86ea-10e33edde0c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08b18fa4-ea02-46c3-86ea-10e33edde0c0" (UID: "08b18fa4-ea02-46c3-86ea-10e33edde0c0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.339220 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b18fa4-ea02-46c3-86ea-10e33edde0c0-kube-api-access-cx442" (OuterVolumeSpecName: "kube-api-access-cx442") pod "08b18fa4-ea02-46c3-86ea-10e33edde0c0" (UID: "08b18fa4-ea02-46c3-86ea-10e33edde0c0"). InnerVolumeSpecName "kube-api-access-cx442". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.340508 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-kube-api-access-xtwzp" (OuterVolumeSpecName: "kube-api-access-xtwzp") pod "3a9bd526-4162-4ed4-8d18-f7dba71eec1f" (UID: "3a9bd526-4162-4ed4-8d18-f7dba71eec1f"). InnerVolumeSpecName "kube-api-access-xtwzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.438206 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b603338-b25a-435a-b8a8-32b3aa1791c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.438423 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8927ec0-1565-476c-9625-462a7d198c4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.438520 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08b18fa4-ea02-46c3-86ea-10e33edde0c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.438579 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30197c83-6070-4ace-b56a-cf82f143ffb5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.438633 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.438691 4807 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-cx442\" (UniqueName: \"kubernetes.io/projected/08b18fa4-ea02-46c3-86ea-10e33edde0c0-kube-api-access-cx442\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.438772 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtwzp\" (UniqueName: \"kubernetes.io/projected/3a9bd526-4162-4ed4-8d18-f7dba71eec1f-kube-api-access-xtwzp\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.480007 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3e9b-account-create-update-6hr9w" event={"ID":"5b4b20e6-d557-429a-b63f-8d51312805c9","Type":"ContainerDied","Data":"2f4907c5a2d61cfefff14185ca7e96745f0a8f25d5a9c57ccdc701884a4f0d36"} Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.480062 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3e9b-account-create-update-6hr9w" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.480080 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4907c5a2d61cfefff14185ca7e96745f0a8f25d5a9c57ccdc701884a4f0d36" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.481692 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2wbvt" event={"ID":"09c59d51-437b-44fc-a75a-d36757cbb08a","Type":"ContainerDied","Data":"fe914b422c537b12719dcbad005a5ceb635a1872eec8403f0dc60255397c21b4"} Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.481752 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe914b422c537b12719dcbad005a5ceb635a1872eec8403f0dc60255397c21b4" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.481792 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-2wbvt" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.483293 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-x5fkh" event={"ID":"3a9bd526-4162-4ed4-8d18-f7dba71eec1f","Type":"ContainerDied","Data":"6a9a1d5b0ac3fc0e884dc4c2abb9d78ad227bf951512f56dae0ced18490accb9"} Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.483327 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-x5fkh" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.483331 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a9a1d5b0ac3fc0e884dc4c2abb9d78ad227bf951512f56dae0ced18490accb9" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.484684 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-6dkkn" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.484680 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-6dkkn" event={"ID":"08b18fa4-ea02-46c3-86ea-10e33edde0c0","Type":"ContainerDied","Data":"bba1b06808659b26e0d1aeceb77e7461f77abba77e624a7caca04582ce5a308f"} Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.484822 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba1b06808659b26e0d1aeceb77e7461f77abba77e624a7caca04582ce5a308f" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.486403 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-23da-account-create-update-6mqn7" event={"ID":"6b603338-b25a-435a-b8a8-32b3aa1791c8","Type":"ContainerDied","Data":"8b74226e2c406fe3b0c62a31daf16045e251bd247eb3ea39ea622e8e01374373"} Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.486435 4807 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8b74226e2c406fe3b0c62a31daf16045e251bd247eb3ea39ea622e8e01374373" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.486414 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-23da-account-create-update-6mqn7" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.488128 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-54fc-account-create-update-c9zhp" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.488117 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-54fc-account-create-update-c9zhp" event={"ID":"30197c83-6070-4ace-b56a-cf82f143ffb5","Type":"ContainerDied","Data":"e3e1387d1c93ddf31bce91d338a8fcce635b2dd85b6670f8fcf6dba974fbb080"} Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.488170 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3e1387d1c93ddf31bce91d338a8fcce635b2dd85b6670f8fcf6dba974fbb080" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.490432 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89cb183e-cc2d-4fd9-90d8-212c434f0d06","Type":"ContainerStarted","Data":"2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6"} Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.492211 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2vb9g" event={"ID":"d8927ec0-1565-476c-9625-462a7d198c4e","Type":"ContainerDied","Data":"e49764c46213b68e310f1d227109f39e911df28c714ea425f01c3b8a29b16a1d"} Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.492235 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e49764c46213b68e310f1d227109f39e911df28c714ea425f01c3b8a29b16a1d" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.492295 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2vb9g" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.499862 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-46a0-account-create-update-6nrzt" event={"ID":"390cf7e1-da04-4176-aa78-71446bc7cef4","Type":"ContainerDied","Data":"29db8e6ab8005bc69fc3e770c5a5bb4801671e7fa4f2a328938876692b8c851e"} Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.499920 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29db8e6ab8005bc69fc3e770c5a5bb4801671e7fa4f2a328938876692b8c851e" Dec 02 20:19:30 crc kubenswrapper[4807]: I1202 20:19:30.500002 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-46a0-account-create-update-6nrzt" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.435502 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kstxp" podUID="80844aa2-667c-4b2d-a55a-e5fa2cd3dd85" containerName="ovn-controller" probeResult="failure" output=< Dec 02 20:19:33 crc kubenswrapper[4807]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 02 20:19:33 crc kubenswrapper[4807]: > Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.460649 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.467233 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pxxrz" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.697834 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-trkqq"] Dec 02 20:19:33 crc kubenswrapper[4807]: E1202 20:19:33.698226 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b603338-b25a-435a-b8a8-32b3aa1791c8" containerName="mariadb-account-create-update" Dec 02 20:19:33 crc 
kubenswrapper[4807]: I1202 20:19:33.698242 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b603338-b25a-435a-b8a8-32b3aa1791c8" containerName="mariadb-account-create-update" Dec 02 20:19:33 crc kubenswrapper[4807]: E1202 20:19:33.698255 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9bd526-4162-4ed4-8d18-f7dba71eec1f" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698262 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9bd526-4162-4ed4-8d18-f7dba71eec1f" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: E1202 20:19:33.698271 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8927ec0-1565-476c-9625-462a7d198c4e" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698278 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8927ec0-1565-476c-9625-462a7d198c4e" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: E1202 20:19:33.698287 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30197c83-6070-4ace-b56a-cf82f143ffb5" containerName="mariadb-account-create-update" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698293 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="30197c83-6070-4ace-b56a-cf82f143ffb5" containerName="mariadb-account-create-update" Dec 02 20:19:33 crc kubenswrapper[4807]: E1202 20:19:33.698304 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390cf7e1-da04-4176-aa78-71446bc7cef4" containerName="mariadb-account-create-update" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698310 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="390cf7e1-da04-4176-aa78-71446bc7cef4" containerName="mariadb-account-create-update" Dec 02 20:19:33 crc kubenswrapper[4807]: E1202 20:19:33.698326 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5b4b20e6-d557-429a-b63f-8d51312805c9" containerName="mariadb-account-create-update" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698333 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4b20e6-d557-429a-b63f-8d51312805c9" containerName="mariadb-account-create-update" Dec 02 20:19:33 crc kubenswrapper[4807]: E1202 20:19:33.698350 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09c59d51-437b-44fc-a75a-d36757cbb08a" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698356 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="09c59d51-437b-44fc-a75a-d36757cbb08a" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: E1202 20:19:33.698367 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08b18fa4-ea02-46c3-86ea-10e33edde0c0" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698372 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b18fa4-ea02-46c3-86ea-10e33edde0c0" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698537 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="08b18fa4-ea02-46c3-86ea-10e33edde0c0" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698559 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b4b20e6-d557-429a-b63f-8d51312805c9" containerName="mariadb-account-create-update" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698574 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="30197c83-6070-4ace-b56a-cf82f143ffb5" containerName="mariadb-account-create-update" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698589 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b603338-b25a-435a-b8a8-32b3aa1791c8" containerName="mariadb-account-create-update" Dec 02 
20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698602 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8927ec0-1565-476c-9625-462a7d198c4e" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698617 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="09c59d51-437b-44fc-a75a-d36757cbb08a" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698628 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9bd526-4162-4ed4-8d18-f7dba71eec1f" containerName="mariadb-database-create" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.698639 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="390cf7e1-da04-4176-aa78-71446bc7cef4" containerName="mariadb-account-create-update" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.699252 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.709224 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.709518 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jtsp2" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.711344 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-trkqq"] Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.723366 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kstxp-config-pbxb7"] Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.724793 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.730818 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.767334 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kstxp-config-pbxb7"] Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.817493 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-scripts\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.817558 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-config-data\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.817586 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.817632 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t62b2\" (UniqueName: \"kubernetes.io/projected/53ba8115-bc84-45ea-804d-c29add38ee3a-kube-api-access-t62b2\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " 
pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.817698 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-log-ovn\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.817728 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-additional-scripts\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.817790 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmps2\" (UniqueName: \"kubernetes.io/projected/94b2120d-4963-4606-b9fe-15569038b2bf-kube-api-access-jmps2\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.817823 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-combined-ca-bundle\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.817854 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run-ovn\") pod 
\"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.817881 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-db-sync-config-data\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.919915 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t62b2\" (UniqueName: \"kubernetes.io/projected/53ba8115-bc84-45ea-804d-c29add38ee3a-kube-api-access-t62b2\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.920211 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-log-ovn\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.920403 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-additional-scripts\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.920604 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-log-ovn\") pod 
\"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.920611 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmps2\" (UniqueName: \"kubernetes.io/projected/94b2120d-4963-4606-b9fe-15569038b2bf-kube-api-access-jmps2\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.920821 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-combined-ca-bundle\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.920899 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run-ovn\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.920923 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-db-sync-config-data\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.921048 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-scripts\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: 
\"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.921517 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-config-data\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.921608 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.921225 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-additional-scripts\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.921109 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run-ovn\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.921833 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " 
pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.923018 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-scripts\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.926474 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-combined-ca-bundle\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.927076 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-config-data\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.941608 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-db-sync-config-data\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.950671 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmps2\" (UniqueName: \"kubernetes.io/projected/94b2120d-4963-4606-b9fe-15569038b2bf-kube-api-access-jmps2\") pod \"ovn-controller-kstxp-config-pbxb7\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:33 crc kubenswrapper[4807]: I1202 20:19:33.954254 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t62b2\" (UniqueName: \"kubernetes.io/projected/53ba8115-bc84-45ea-804d-c29add38ee3a-kube-api-access-t62b2\") pod \"glance-db-sync-trkqq\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:34 crc kubenswrapper[4807]: I1202 20:19:34.047144 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-trkqq" Dec 02 20:19:34 crc kubenswrapper[4807]: I1202 20:19:34.063378 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:34 crc kubenswrapper[4807]: I1202 20:19:34.538981 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89cb183e-cc2d-4fd9-90d8-212c434f0d06","Type":"ContainerStarted","Data":"22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5"} Dec 02 20:19:34 crc kubenswrapper[4807]: I1202 20:19:34.567318 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.83765197 podStartE2EDuration="1m4.567298741s" podCreationTimestamp="2025-12-02 20:18:30 +0000 UTC" firstStartedPulling="2025-12-02 20:18:50.175928677 +0000 UTC m=+1265.476836172" lastFinishedPulling="2025-12-02 20:19:33.905575448 +0000 UTC m=+1309.206482943" observedRunningTime="2025-12-02 20:19:34.56022136 +0000 UTC m=+1309.861128865" watchObservedRunningTime="2025-12-02 20:19:34.567298741 +0000 UTC m=+1309.868206236" Dec 02 20:19:34 crc kubenswrapper[4807]: I1202 20:19:34.616562 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kstxp-config-pbxb7"] Dec 02 20:19:34 crc kubenswrapper[4807]: W1202 20:19:34.618960 4807 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94b2120d_4963_4606_b9fe_15569038b2bf.slice/crio-44d4b30a2bb6903c6dbabeb40bf2a549ba4dd4590bed0f0d898eb08e22fe1629 WatchSource:0}: Error finding container 44d4b30a2bb6903c6dbabeb40bf2a549ba4dd4590bed0f0d898eb08e22fe1629: Status 404 returned error can't find the container with id 44d4b30a2bb6903c6dbabeb40bf2a549ba4dd4590bed0f0d898eb08e22fe1629 Dec 02 20:19:34 crc kubenswrapper[4807]: I1202 20:19:34.774642 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-trkqq"] Dec 02 20:19:34 crc kubenswrapper[4807]: I1202 20:19:34.775911 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:35 crc kubenswrapper[4807]: I1202 20:19:35.028125 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:19:35 crc kubenswrapper[4807]: I1202 20:19:35.100310 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-xkdjx"] Dec 02 20:19:35 crc kubenswrapper[4807]: E1202 20:19:35.507281 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3dd1529_ab40_486b_8458_e3d1afc9a0e2.slice/crio-71f6b872d77b94a0995c498bad5b0c3ad377f9739102d6f20bee41f32571bd52.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94b2120d_4963_4606_b9fe_15569038b2bf.slice/crio-conmon-783ee09a7ea28f99781fd2a5396867d821685cbfa7907f2db86f0446e40d3775.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3dd1529_ab40_486b_8458_e3d1afc9a0e2.slice/crio-conmon-71f6b872d77b94a0995c498bad5b0c3ad377f9739102d6f20bee41f32571bd52.scope\": RecentStats: unable to find data in memory 
cache]" Dec 02 20:19:35 crc kubenswrapper[4807]: I1202 20:19:35.549551 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-trkqq" event={"ID":"53ba8115-bc84-45ea-804d-c29add38ee3a","Type":"ContainerStarted","Data":"2dfb6444d84c6e4ad81c65cc59370fe3a41db0dbf816a0b9e658e484ca42f3ed"} Dec 02 20:19:35 crc kubenswrapper[4807]: I1202 20:19:35.553505 4807 generic.go:334] "Generic (PLEG): container finished" podID="a3dd1529-ab40-486b-8458-e3d1afc9a0e2" containerID="71f6b872d77b94a0995c498bad5b0c3ad377f9739102d6f20bee41f32571bd52" exitCode=0 Dec 02 20:19:35 crc kubenswrapper[4807]: I1202 20:19:35.553593 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-msbf5" event={"ID":"a3dd1529-ab40-486b-8458-e3d1afc9a0e2","Type":"ContainerDied","Data":"71f6b872d77b94a0995c498bad5b0c3ad377f9739102d6f20bee41f32571bd52"} Dec 02 20:19:35 crc kubenswrapper[4807]: I1202 20:19:35.557533 4807 generic.go:334] "Generic (PLEG): container finished" podID="94b2120d-4963-4606-b9fe-15569038b2bf" containerID="783ee09a7ea28f99781fd2a5396867d821685cbfa7907f2db86f0446e40d3775" exitCode=0 Dec 02 20:19:35 crc kubenswrapper[4807]: I1202 20:19:35.557602 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kstxp-config-pbxb7" event={"ID":"94b2120d-4963-4606-b9fe-15569038b2bf","Type":"ContainerDied","Data":"783ee09a7ea28f99781fd2a5396867d821685cbfa7907f2db86f0446e40d3775"} Dec 02 20:19:35 crc kubenswrapper[4807]: I1202 20:19:35.557640 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kstxp-config-pbxb7" event={"ID":"94b2120d-4963-4606-b9fe-15569038b2bf","Type":"ContainerStarted","Data":"44d4b30a2bb6903c6dbabeb40bf2a549ba4dd4590bed0f0d898eb08e22fe1629"} Dec 02 20:19:35 crc kubenswrapper[4807]: I1202 20:19:35.557896 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" 
podUID="688a394e-b6ac-45a4-8617-1b59bcb1d7b6" containerName="dnsmasq-dns" containerID="cri-o://d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8" gracePeriod=10 Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.158564 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.167034 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpkw6\" (UniqueName: \"kubernetes.io/projected/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-kube-api-access-qpkw6\") pod \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.167113 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-dns-svc\") pod \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.167182 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-ovsdbserver-sb\") pod \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.167260 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-config\") pod \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\" (UID: \"688a394e-b6ac-45a4-8617-1b59bcb1d7b6\") " Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.187556 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-kube-api-access-qpkw6" 
(OuterVolumeSpecName: "kube-api-access-qpkw6") pod "688a394e-b6ac-45a4-8617-1b59bcb1d7b6" (UID: "688a394e-b6ac-45a4-8617-1b59bcb1d7b6"). InnerVolumeSpecName "kube-api-access-qpkw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.241383 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "688a394e-b6ac-45a4-8617-1b59bcb1d7b6" (UID: "688a394e-b6ac-45a4-8617-1b59bcb1d7b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.247458 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "688a394e-b6ac-45a4-8617-1b59bcb1d7b6" (UID: "688a394e-b6ac-45a4-8617-1b59bcb1d7b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.259854 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-config" (OuterVolumeSpecName: "config") pod "688a394e-b6ac-45a4-8617-1b59bcb1d7b6" (UID: "688a394e-b6ac-45a4-8617-1b59bcb1d7b6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.269635 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.269681 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpkw6\" (UniqueName: \"kubernetes.io/projected/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-kube-api-access-qpkw6\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.269693 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.269704 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/688a394e-b6ac-45a4-8617-1b59bcb1d7b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.391856 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.593371 4807 generic.go:334] "Generic (PLEG): container finished" podID="688a394e-b6ac-45a4-8617-1b59bcb1d7b6" containerID="d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8" exitCode=0 Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.594515 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.594872 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" event={"ID":"688a394e-b6ac-45a4-8617-1b59bcb1d7b6","Type":"ContainerDied","Data":"d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8"} Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.594911 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8cc7fc4dc-xkdjx" event={"ID":"688a394e-b6ac-45a4-8617-1b59bcb1d7b6","Type":"ContainerDied","Data":"d890c45f54a719e9902a2bd5c8a4a10d2bce1c40e4a3766ed9df490e084fde7a"} Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.594935 4807 scope.go:117] "RemoveContainer" containerID="d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.665835 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-xkdjx"] Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.673765 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8cc7fc4dc-xkdjx"] Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.685903 4807 scope.go:117] "RemoveContainer" containerID="1c7d641f183cba87a8f1bf64128f0b5d2c35ce3c74ffd94b0ad626df5f1a5b7d" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.755219 4807 scope.go:117] "RemoveContainer" containerID="d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8" Dec 02 20:19:36 crc kubenswrapper[4807]: E1202 20:19:36.755683 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8\": container with ID starting with d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8 not found: ID does not exist" 
containerID="d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.755730 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8"} err="failed to get container status \"d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8\": rpc error: code = NotFound desc = could not find container \"d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8\": container with ID starting with d086eb7ac5b21eacea20e34602fca073925e7b8ecd5182415e4946da99f91ce8 not found: ID does not exist" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.755750 4807 scope.go:117] "RemoveContainer" containerID="1c7d641f183cba87a8f1bf64128f0b5d2c35ce3c74ffd94b0ad626df5f1a5b7d" Dec 02 20:19:36 crc kubenswrapper[4807]: E1202 20:19:36.758735 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c7d641f183cba87a8f1bf64128f0b5d2c35ce3c74ffd94b0ad626df5f1a5b7d\": container with ID starting with 1c7d641f183cba87a8f1bf64128f0b5d2c35ce3c74ffd94b0ad626df5f1a5b7d not found: ID does not exist" containerID="1c7d641f183cba87a8f1bf64128f0b5d2c35ce3c74ffd94b0ad626df5f1a5b7d" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.758763 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c7d641f183cba87a8f1bf64128f0b5d2c35ce3c74ffd94b0ad626df5f1a5b7d"} err="failed to get container status \"1c7d641f183cba87a8f1bf64128f0b5d2c35ce3c74ffd94b0ad626df5f1a5b7d\": rpc error: code = NotFound desc = could not find container \"1c7d641f183cba87a8f1bf64128f0b5d2c35ce3c74ffd94b0ad626df5f1a5b7d\": container with ID starting with 1c7d641f183cba87a8f1bf64128f0b5d2c35ce3c74ffd94b0ad626df5f1a5b7d not found: ID does not exist" Dec 02 20:19:36 crc kubenswrapper[4807]: I1202 20:19:36.983754 4807 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="688a394e-b6ac-45a4-8617-1b59bcb1d7b6" path="/var/lib/kubelet/pods/688a394e-b6ac-45a4-8617-1b59bcb1d7b6/volumes" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.032885 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.142294 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.188093 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-combined-ca-bundle\") pod \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.188539 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-etc-swift\") pod \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.188590 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-ring-data-devices\") pod \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.188617 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-scripts\") pod \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " Dec 02 20:19:37 crc 
kubenswrapper[4807]: I1202 20:19:37.188666 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-dispersionconf\") pod \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.188713 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9bnh\" (UniqueName: \"kubernetes.io/projected/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-kube-api-access-f9bnh\") pod \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.188781 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-swiftconf\") pod \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\" (UID: \"a3dd1529-ab40-486b-8458-e3d1afc9a0e2\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.189252 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a3dd1529-ab40-486b-8458-e3d1afc9a0e2" (UID: "a3dd1529-ab40-486b-8458-e3d1afc9a0e2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.189388 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a3dd1529-ab40-486b-8458-e3d1afc9a0e2" (UID: "a3dd1529-ab40-486b-8458-e3d1afc9a0e2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.192453 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-kube-api-access-f9bnh" (OuterVolumeSpecName: "kube-api-access-f9bnh") pod "a3dd1529-ab40-486b-8458-e3d1afc9a0e2" (UID: "a3dd1529-ab40-486b-8458-e3d1afc9a0e2"). InnerVolumeSpecName "kube-api-access-f9bnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.196647 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a3dd1529-ab40-486b-8458-e3d1afc9a0e2" (UID: "a3dd1529-ab40-486b-8458-e3d1afc9a0e2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.210819 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-scripts" (OuterVolumeSpecName: "scripts") pod "a3dd1529-ab40-486b-8458-e3d1afc9a0e2" (UID: "a3dd1529-ab40-486b-8458-e3d1afc9a0e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.217709 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3dd1529-ab40-486b-8458-e3d1afc9a0e2" (UID: "a3dd1529-ab40-486b-8458-e3d1afc9a0e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.219662 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a3dd1529-ab40-486b-8458-e3d1afc9a0e2" (UID: "a3dd1529-ab40-486b-8458-e3d1afc9a0e2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.290953 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-scripts\") pod \"94b2120d-4963-4606-b9fe-15569038b2bf\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.291166 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run\") pod \"94b2120d-4963-4606-b9fe-15569038b2bf\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.291244 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmps2\" (UniqueName: \"kubernetes.io/projected/94b2120d-4963-4606-b9fe-15569038b2bf-kube-api-access-jmps2\") pod \"94b2120d-4963-4606-b9fe-15569038b2bf\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.291374 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-log-ovn\") pod \"94b2120d-4963-4606-b9fe-15569038b2bf\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.291378 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run" (OuterVolumeSpecName: "var-run") pod "94b2120d-4963-4606-b9fe-15569038b2bf" (UID: "94b2120d-4963-4606-b9fe-15569038b2bf"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.291434 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run-ovn\") pod \"94b2120d-4963-4606-b9fe-15569038b2bf\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.291639 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-additional-scripts\") pod \"94b2120d-4963-4606-b9fe-15569038b2bf\" (UID: \"94b2120d-4963-4606-b9fe-15569038b2bf\") " Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.292194 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "94b2120d-4963-4606-b9fe-15569038b2bf" (UID: "94b2120d-4963-4606-b9fe-15569038b2bf"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.292542 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-scripts" (OuterVolumeSpecName: "scripts") pod "94b2120d-4963-4606-b9fe-15569038b2bf" (UID: "94b2120d-4963-4606-b9fe-15569038b2bf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.292603 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "94b2120d-4963-4606-b9fe-15569038b2bf" (UID: "94b2120d-4963-4606-b9fe-15569038b2bf"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.292759 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "94b2120d-4963-4606-b9fe-15569038b2bf" (UID: "94b2120d-4963-4606-b9fe-15569038b2bf"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295191 4807 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295223 4807 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295238 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295252 4807 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 
20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295266 4807 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/94b2120d-4963-4606-b9fe-15569038b2bf-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295278 4807 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295289 4807 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295302 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295315 4807 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295326 4807 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295338 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9bnh\" (UniqueName: \"kubernetes.io/projected/a3dd1529-ab40-486b-8458-e3d1afc9a0e2-kube-api-access-f9bnh\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.295353 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/94b2120d-4963-4606-b9fe-15569038b2bf-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.302368 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b2120d-4963-4606-b9fe-15569038b2bf-kube-api-access-jmps2" (OuterVolumeSpecName: "kube-api-access-jmps2") pod "94b2120d-4963-4606-b9fe-15569038b2bf" (UID: "94b2120d-4963-4606-b9fe-15569038b2bf"). InnerVolumeSpecName "kube-api-access-jmps2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.397237 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmps2\" (UniqueName: \"kubernetes.io/projected/94b2120d-4963-4606-b9fe-15569038b2bf-kube-api-access-jmps2\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.605412 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-msbf5" event={"ID":"a3dd1529-ab40-486b-8458-e3d1afc9a0e2","Type":"ContainerDied","Data":"d3be9e7893e6b2852ac72c6275d412b24a9fb782ad6883d9fa65623ac3c0a5bb"} Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.605460 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3be9e7893e6b2852ac72c6275d412b24a9fb782ad6883d9fa65623ac3c0a5bb" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.605550 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-msbf5" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.625065 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kstxp-config-pbxb7" event={"ID":"94b2120d-4963-4606-b9fe-15569038b2bf","Type":"ContainerDied","Data":"44d4b30a2bb6903c6dbabeb40bf2a549ba4dd4590bed0f0d898eb08e22fe1629"} Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.625108 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44d4b30a2bb6903c6dbabeb40bf2a549ba4dd4590bed0f0d898eb08e22fe1629" Dec 02 20:19:37 crc kubenswrapper[4807]: I1202 20:19:37.625162 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kstxp-config-pbxb7" Dec 02 20:19:38 crc kubenswrapper[4807]: I1202 20:19:38.247512 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kstxp-config-pbxb7"] Dec 02 20:19:38 crc kubenswrapper[4807]: I1202 20:19:38.254584 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kstxp-config-pbxb7"] Dec 02 20:19:38 crc kubenswrapper[4807]: I1202 20:19:38.426190 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-kstxp" Dec 02 20:19:38 crc kubenswrapper[4807]: I1202 20:19:38.983734 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b2120d-4963-4606-b9fe-15569038b2bf" path="/var/lib/kubelet/pods/94b2120d-4963-4606-b9fe-15569038b2bf/volumes" Dec 02 20:19:40 crc kubenswrapper[4807]: I1202 20:19:40.146867 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 02 20:19:43 crc kubenswrapper[4807]: I1202 20:19:43.738659 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift\") pod \"swift-storage-0\" 
(UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:43 crc kubenswrapper[4807]: I1202 20:19:43.759596 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/caaccc0b-6743-4907-9d87-f4ab26c931e2-etc-swift\") pod \"swift-storage-0\" (UID: \"caaccc0b-6743-4907-9d87-f4ab26c931e2\") " pod="openstack/swift-storage-0" Dec 02 20:19:43 crc kubenswrapper[4807]: I1202 20:19:43.930812 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.201077 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.238011 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.604314 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8r86t"] Dec 02 20:19:45 crc kubenswrapper[4807]: E1202 20:19:45.605055 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688a394e-b6ac-45a4-8617-1b59bcb1d7b6" containerName="dnsmasq-dns" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.605075 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="688a394e-b6ac-45a4-8617-1b59bcb1d7b6" containerName="dnsmasq-dns" Dec 02 20:19:45 crc kubenswrapper[4807]: E1202 20:19:45.605091 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dd1529-ab40-486b-8458-e3d1afc9a0e2" containerName="swift-ring-rebalance" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.605099 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dd1529-ab40-486b-8458-e3d1afc9a0e2" containerName="swift-ring-rebalance" Dec 02 20:19:45 crc kubenswrapper[4807]: E1202 20:19:45.605110 4807 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="94b2120d-4963-4606-b9fe-15569038b2bf" containerName="ovn-config" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.605116 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b2120d-4963-4606-b9fe-15569038b2bf" containerName="ovn-config" Dec 02 20:19:45 crc kubenswrapper[4807]: E1202 20:19:45.605129 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688a394e-b6ac-45a4-8617-1b59bcb1d7b6" containerName="init" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.605135 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="688a394e-b6ac-45a4-8617-1b59bcb1d7b6" containerName="init" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.605300 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="688a394e-b6ac-45a4-8617-1b59bcb1d7b6" containerName="dnsmasq-dns" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.605315 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b2120d-4963-4606-b9fe-15569038b2bf" containerName="ovn-config" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.605332 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3dd1529-ab40-486b-8458-e3d1afc9a0e2" containerName="swift-ring-rebalance" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.605938 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8r86t" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.637241 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8r86t"] Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.662824 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-95a6-account-create-update-5dpp4"] Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.664121 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-95a6-account-create-update-5dpp4" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.670779 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.693507 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzr8c\" (UniqueName: \"kubernetes.io/projected/38f1e4f9-4739-44a2-9257-389f73e77dfa-kube-api-access-fzr8c\") pod \"barbican-95a6-account-create-update-5dpp4\" (UID: \"38f1e4f9-4739-44a2-9257-389f73e77dfa\") " pod="openstack/barbican-95a6-account-create-update-5dpp4" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.693558 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v48s\" (UniqueName: \"kubernetes.io/projected/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-kube-api-access-5v48s\") pod \"cinder-db-create-8r86t\" (UID: \"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773\") " pod="openstack/cinder-db-create-8r86t" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.693609 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38f1e4f9-4739-44a2-9257-389f73e77dfa-operator-scripts\") pod \"barbican-95a6-account-create-update-5dpp4\" (UID: \"38f1e4f9-4739-44a2-9257-389f73e77dfa\") " pod="openstack/barbican-95a6-account-create-update-5dpp4" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.693633 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-operator-scripts\") pod \"cinder-db-create-8r86t\" (UID: \"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773\") " pod="openstack/cinder-db-create-8r86t" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 
20:19:45.716510 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-95a6-account-create-update-5dpp4"] Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.737600 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-vsnlc"] Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.739330 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.744282 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-5lgzc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.744488 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.753865 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-vsnlc"] Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.796287 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38f1e4f9-4739-44a2-9257-389f73e77dfa-operator-scripts\") pod \"barbican-95a6-account-create-update-5dpp4\" (UID: \"38f1e4f9-4739-44a2-9257-389f73e77dfa\") " pod="openstack/barbican-95a6-account-create-update-5dpp4" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.796345 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-operator-scripts\") pod \"cinder-db-create-8r86t\" (UID: \"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773\") " pod="openstack/cinder-db-create-8r86t" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.796408 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-combined-ca-bundle\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.796479 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzr8c\" (UniqueName: \"kubernetes.io/projected/38f1e4f9-4739-44a2-9257-389f73e77dfa-kube-api-access-fzr8c\") pod \"barbican-95a6-account-create-update-5dpp4\" (UID: \"38f1e4f9-4739-44a2-9257-389f73e77dfa\") " pod="openstack/barbican-95a6-account-create-update-5dpp4" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.796510 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v48s\" (UniqueName: \"kubernetes.io/projected/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-kube-api-access-5v48s\") pod \"cinder-db-create-8r86t\" (UID: \"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773\") " pod="openstack/cinder-db-create-8r86t" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.796528 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-db-sync-config-data\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.796557 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6pfp\" (UniqueName: \"kubernetes.io/projected/5e35946b-d565-4354-9c86-8eb06b4ed154-kube-api-access-b6pfp\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.796584 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-config-data\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.797584 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38f1e4f9-4739-44a2-9257-389f73e77dfa-operator-scripts\") pod \"barbican-95a6-account-create-update-5dpp4\" (UID: \"38f1e4f9-4739-44a2-9257-389f73e77dfa\") " pod="openstack/barbican-95a6-account-create-update-5dpp4" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.798371 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-operator-scripts\") pod \"cinder-db-create-8r86t\" (UID: \"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773\") " pod="openstack/cinder-db-create-8r86t" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.815955 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f2a8-account-create-update-7v4b4"] Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.817134 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f2a8-account-create-update-7v4b4" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.821447 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.822462 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzr8c\" (UniqueName: \"kubernetes.io/projected/38f1e4f9-4739-44a2-9257-389f73e77dfa-kube-api-access-fzr8c\") pod \"barbican-95a6-account-create-update-5dpp4\" (UID: \"38f1e4f9-4739-44a2-9257-389f73e77dfa\") " pod="openstack/barbican-95a6-account-create-update-5dpp4" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.822995 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v48s\" (UniqueName: \"kubernetes.io/projected/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-kube-api-access-5v48s\") pod \"cinder-db-create-8r86t\" (UID: \"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773\") " pod="openstack/cinder-db-create-8r86t" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.839309 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wmhlb"] Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.841013 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wmhlb" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.847355 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f2a8-account-create-update-7v4b4"] Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.856784 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wmhlb"] Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.926838 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-db-sync-config-data\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.927337 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6pfp\" (UniqueName: \"kubernetes.io/projected/5e35946b-d565-4354-9c86-8eb06b4ed154-kube-api-access-b6pfp\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.927414 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-config-data\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.927477 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323b935c-8f72-4514-96c0-6edea05c6498-operator-scripts\") pod \"barbican-db-create-wmhlb\" (UID: \"323b935c-8f72-4514-96c0-6edea05c6498\") " pod="openstack/barbican-db-create-wmhlb" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 
20:19:45.927540 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wr7c\" (UniqueName: \"kubernetes.io/projected/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-kube-api-access-5wr7c\") pod \"cinder-f2a8-account-create-update-7v4b4\" (UID: \"1141d1f9-6d7f-46fa-8e85-511b8a0adddf\") " pod="openstack/cinder-f2a8-account-create-update-7v4b4" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.927655 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-combined-ca-bundle\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.927688 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkdk2\" (UniqueName: \"kubernetes.io/projected/323b935c-8f72-4514-96c0-6edea05c6498-kube-api-access-pkdk2\") pod \"barbican-db-create-wmhlb\" (UID: \"323b935c-8f72-4514-96c0-6edea05c6498\") " pod="openstack/barbican-db-create-wmhlb" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.927866 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-operator-scripts\") pod \"cinder-f2a8-account-create-update-7v4b4\" (UID: \"1141d1f9-6d7f-46fa-8e85-511b8a0adddf\") " pod="openstack/cinder-f2a8-account-create-update-7v4b4" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.932504 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-config-data\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 
crc kubenswrapper[4807]: I1202 20:19:45.933211 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-combined-ca-bundle\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.946265 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8r86t" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.950672 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-db-sync-config-data\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.970491 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qr6rh"] Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.971558 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6pfp\" (UniqueName: \"kubernetes.io/projected/5e35946b-d565-4354-9c86-8eb06b4ed154-kube-api-access-b6pfp\") pod \"watcher-db-sync-vsnlc\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:45 crc kubenswrapper[4807]: I1202 20:19:45.993752 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qr6rh" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.003811 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qr6rh"] Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.012903 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-95a6-account-create-update-5dpp4" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.024628 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6097-account-create-update-jg6j9"] Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.027575 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6097-account-create-update-jg6j9" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.029393 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323b935c-8f72-4514-96c0-6edea05c6498-operator-scripts\") pod \"barbican-db-create-wmhlb\" (UID: \"323b935c-8f72-4514-96c0-6edea05c6498\") " pod="openstack/barbican-db-create-wmhlb" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.029439 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wr7c\" (UniqueName: \"kubernetes.io/projected/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-kube-api-access-5wr7c\") pod \"cinder-f2a8-account-create-update-7v4b4\" (UID: \"1141d1f9-6d7f-46fa-8e85-511b8a0adddf\") " pod="openstack/cinder-f2a8-account-create-update-7v4b4" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.029504 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkdk2\" (UniqueName: \"kubernetes.io/projected/323b935c-8f72-4514-96c0-6edea05c6498-kube-api-access-pkdk2\") pod \"barbican-db-create-wmhlb\" (UID: \"323b935c-8f72-4514-96c0-6edea05c6498\") " pod="openstack/barbican-db-create-wmhlb" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.029537 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa51800c-1f7b-4000-8547-c391f3a6dc6a-operator-scripts\") pod \"neutron-db-create-qr6rh\" (UID: 
\"aa51800c-1f7b-4000-8547-c391f3a6dc6a\") " pod="openstack/neutron-db-create-qr6rh" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.029577 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-operator-scripts\") pod \"cinder-f2a8-account-create-update-7v4b4\" (UID: \"1141d1f9-6d7f-46fa-8e85-511b8a0adddf\") " pod="openstack/cinder-f2a8-account-create-update-7v4b4" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.029610 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v868\" (UniqueName: \"kubernetes.io/projected/aa51800c-1f7b-4000-8547-c391f3a6dc6a-kube-api-access-9v868\") pod \"neutron-db-create-qr6rh\" (UID: \"aa51800c-1f7b-4000-8547-c391f3a6dc6a\") " pod="openstack/neutron-db-create-qr6rh" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.030102 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.031549 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-operator-scripts\") pod \"cinder-f2a8-account-create-update-7v4b4\" (UID: \"1141d1f9-6d7f-46fa-8e85-511b8a0adddf\") " pod="openstack/cinder-f2a8-account-create-update-7v4b4" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.032517 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323b935c-8f72-4514-96c0-6edea05c6498-operator-scripts\") pod \"barbican-db-create-wmhlb\" (UID: \"323b935c-8f72-4514-96c0-6edea05c6498\") " pod="openstack/barbican-db-create-wmhlb" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.065934 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.070151 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wr7c\" (UniqueName: \"kubernetes.io/projected/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-kube-api-access-5wr7c\") pod \"cinder-f2a8-account-create-update-7v4b4\" (UID: \"1141d1f9-6d7f-46fa-8e85-511b8a0adddf\") " pod="openstack/cinder-f2a8-account-create-update-7v4b4" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.086408 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6097-account-create-update-jg6j9"] Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.092150 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkdk2\" (UniqueName: \"kubernetes.io/projected/323b935c-8f72-4514-96c0-6edea05c6498-kube-api-access-pkdk2\") pod \"barbican-db-create-wmhlb\" (UID: \"323b935c-8f72-4514-96c0-6edea05c6498\") " pod="openstack/barbican-db-create-wmhlb" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.134972 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4226x\" (UniqueName: \"kubernetes.io/projected/e6b0a41b-3334-4181-bad4-fd5cf030b010-kube-api-access-4226x\") pod \"neutron-6097-account-create-update-jg6j9\" (UID: \"e6b0a41b-3334-4181-bad4-fd5cf030b010\") " pod="openstack/neutron-6097-account-create-update-jg6j9" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.135030 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa51800c-1f7b-4000-8547-c391f3a6dc6a-operator-scripts\") pod \"neutron-db-create-qr6rh\" (UID: \"aa51800c-1f7b-4000-8547-c391f3a6dc6a\") " pod="openstack/neutron-db-create-qr6rh" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.135065 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b0a41b-3334-4181-bad4-fd5cf030b010-operator-scripts\") pod \"neutron-6097-account-create-update-jg6j9\" (UID: \"e6b0a41b-3334-4181-bad4-fd5cf030b010\") " pod="openstack/neutron-6097-account-create-update-jg6j9" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.135098 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v868\" (UniqueName: \"kubernetes.io/projected/aa51800c-1f7b-4000-8547-c391f3a6dc6a-kube-api-access-9v868\") pod \"neutron-db-create-qr6rh\" (UID: \"aa51800c-1f7b-4000-8547-c391f3a6dc6a\") " pod="openstack/neutron-db-create-qr6rh" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.136177 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa51800c-1f7b-4000-8547-c391f3a6dc6a-operator-scripts\") pod \"neutron-db-create-qr6rh\" (UID: \"aa51800c-1f7b-4000-8547-c391f3a6dc6a\") " pod="openstack/neutron-db-create-qr6rh" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.141755 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-9qc25"] Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.143063 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9qc25" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.147804 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.148105 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.148247 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.152941 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-psn74" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.156801 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9qc25"] Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.158507 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v868\" (UniqueName: \"kubernetes.io/projected/aa51800c-1f7b-4000-8547-c391f3a6dc6a-kube-api-access-9v868\") pod \"neutron-db-create-qr6rh\" (UID: \"aa51800c-1f7b-4000-8547-c391f3a6dc6a\") " pod="openstack/neutron-db-create-qr6rh" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.187296 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f2a8-account-create-update-7v4b4" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.210478 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wmhlb" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.236965 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-combined-ca-bundle\") pod \"keystone-db-sync-9qc25\" (UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " pod="openstack/keystone-db-sync-9qc25" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.237090 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4226x\" (UniqueName: \"kubernetes.io/projected/e6b0a41b-3334-4181-bad4-fd5cf030b010-kube-api-access-4226x\") pod \"neutron-6097-account-create-update-jg6j9\" (UID: \"e6b0a41b-3334-4181-bad4-fd5cf030b010\") " pod="openstack/neutron-6097-account-create-update-jg6j9" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.237145 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b0a41b-3334-4181-bad4-fd5cf030b010-operator-scripts\") pod \"neutron-6097-account-create-update-jg6j9\" (UID: \"e6b0a41b-3334-4181-bad4-fd5cf030b010\") " pod="openstack/neutron-6097-account-create-update-jg6j9" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.237167 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkk29\" (UniqueName: \"kubernetes.io/projected/103c33fd-ac95-4efe-b1c2-b6c9187eb613-kube-api-access-xkk29\") pod \"keystone-db-sync-9qc25\" (UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " pod="openstack/keystone-db-sync-9qc25" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.237205 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-config-data\") pod 
\"keystone-db-sync-9qc25\" (UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " pod="openstack/keystone-db-sync-9qc25" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.238253 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b0a41b-3334-4181-bad4-fd5cf030b010-operator-scripts\") pod \"neutron-6097-account-create-update-jg6j9\" (UID: \"e6b0a41b-3334-4181-bad4-fd5cf030b010\") " pod="openstack/neutron-6097-account-create-update-jg6j9" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.255423 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4226x\" (UniqueName: \"kubernetes.io/projected/e6b0a41b-3334-4181-bad4-fd5cf030b010-kube-api-access-4226x\") pod \"neutron-6097-account-create-update-jg6j9\" (UID: \"e6b0a41b-3334-4181-bad4-fd5cf030b010\") " pod="openstack/neutron-6097-account-create-update-jg6j9" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.339194 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkk29\" (UniqueName: \"kubernetes.io/projected/103c33fd-ac95-4efe-b1c2-b6c9187eb613-kube-api-access-xkk29\") pod \"keystone-db-sync-9qc25\" (UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " pod="openstack/keystone-db-sync-9qc25" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.339278 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-config-data\") pod \"keystone-db-sync-9qc25\" (UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " pod="openstack/keystone-db-sync-9qc25" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.339368 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-combined-ca-bundle\") pod \"keystone-db-sync-9qc25\" 
(UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " pod="openstack/keystone-db-sync-9qc25" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.343156 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-config-data\") pod \"keystone-db-sync-9qc25\" (UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " pod="openstack/keystone-db-sync-9qc25" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.343331 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-combined-ca-bundle\") pod \"keystone-db-sync-9qc25\" (UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " pod="openstack/keystone-db-sync-9qc25" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.350060 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qr6rh" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.357619 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkk29\" (UniqueName: \"kubernetes.io/projected/103c33fd-ac95-4efe-b1c2-b6c9187eb613-kube-api-access-xkk29\") pod \"keystone-db-sync-9qc25\" (UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " pod="openstack/keystone-db-sync-9qc25" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.359494 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6097-account-create-update-jg6j9" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.391763 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.396232 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.492378 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9qc25" Dec 02 20:19:46 crc kubenswrapper[4807]: I1202 20:19:46.732509 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.326236 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8r86t"] Dec 02 20:19:49 crc kubenswrapper[4807]: W1202 20:19:49.344439 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b198e4_d58c_4c1e_b6eb_35ecdeb30773.slice/crio-588e11ab75b7501dcbdc7c4846e5e09f80f53b4c0c9c60d7b1c2752fa85ac260 WatchSource:0}: Error finding container 588e11ab75b7501dcbdc7c4846e5e09f80f53b4c0c9c60d7b1c2752fa85ac260: Status 404 returned error can't find the container with id 588e11ab75b7501dcbdc7c4846e5e09f80f53b4c0c9c60d7b1c2752fa85ac260 Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.371467 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-95a6-account-create-update-5dpp4"] Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.387063 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.387375 4807 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/prometheus-metric-storage-0" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="prometheus" containerID="cri-o://ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08" gracePeriod=600 Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.387846 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="thanos-sidecar" containerID="cri-o://22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5" gracePeriod=600 Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.387916 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="config-reloader" containerID="cri-o://2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6" gracePeriod=600 Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.394031 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qr6rh"] Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.537857 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wmhlb"] Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.546839 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f2a8-account-create-update-7v4b4"] Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.560201 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6097-account-create-update-jg6j9"] Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.567685 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9qc25"] Dec 02 20:19:49 crc kubenswrapper[4807]: W1202 20:19:49.586012 4807 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1141d1f9_6d7f_46fa_8e85_511b8a0adddf.slice/crio-fb466f3b61826f5d9d94c1b7c08acf5ebc88e135424c1b73763a6e8ef2b28130 WatchSource:0}: Error finding container fb466f3b61826f5d9d94c1b7c08acf5ebc88e135424c1b73763a6e8ef2b28130: Status 404 returned error can't find the container with id fb466f3b61826f5d9d94c1b7c08acf5ebc88e135424c1b73763a6e8ef2b28130 Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.598597 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 20:19:49 crc kubenswrapper[4807]: W1202 20:19:49.620772 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b0a41b_3334_4181_bad4_fd5cf030b010.slice/crio-330304981f6399e1c8381a0b6b70edc451027961cc75001941999dbf23f22b85 WatchSource:0}: Error finding container 330304981f6399e1c8381a0b6b70edc451027961cc75001941999dbf23f22b85: Status 404 returned error can't find the container with id 330304981f6399e1c8381a0b6b70edc451027961cc75001941999dbf23f22b85 Dec 02 20:19:49 crc kubenswrapper[4807]: W1202 20:19:49.629319 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaaccc0b_6743_4907_9d87_f4ab26c931e2.slice/crio-a7bd4e0620cead827890c69f61c5c501a0db2e70c56d4ece1ae7ae8e0826de30 WatchSource:0}: Error finding container a7bd4e0620cead827890c69f61c5c501a0db2e70c56d4ece1ae7ae8e0826de30: Status 404 returned error can't find the container with id a7bd4e0620cead827890c69f61c5c501a0db2e70c56d4ece1ae7ae8e0826de30 Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.693639 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-vsnlc"] Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.765006 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6097-account-create-update-jg6j9" 
event={"ID":"e6b0a41b-3334-4181-bad4-fd5cf030b010","Type":"ContainerStarted","Data":"330304981f6399e1c8381a0b6b70edc451027961cc75001941999dbf23f22b85"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.766196 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"a7bd4e0620cead827890c69f61c5c501a0db2e70c56d4ece1ae7ae8e0826de30"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.768354 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-trkqq" event={"ID":"53ba8115-bc84-45ea-804d-c29add38ee3a","Type":"ContainerStarted","Data":"57c32b143dbf76bcc9aceaabf44ea7e4c6f20502a7a47d4ec95ecb1f7e2e7dbe"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.778382 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-95a6-account-create-update-5dpp4" event={"ID":"38f1e4f9-4739-44a2-9257-389f73e77dfa","Type":"ContainerStarted","Data":"4246feddc5fbbfa4b83717bf8100b7a7c0675b43a326191b90be36d8c1e58dc0"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.780558 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wmhlb" event={"ID":"323b935c-8f72-4514-96c0-6edea05c6498","Type":"ContainerStarted","Data":"d43cdc8efb7be25b964c8a6421cc47fc731225af3aed41473f50e6141869ef77"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.786077 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f2a8-account-create-update-7v4b4" event={"ID":"1141d1f9-6d7f-46fa-8e85-511b8a0adddf","Type":"ContainerStarted","Data":"fb466f3b61826f5d9d94c1b7c08acf5ebc88e135424c1b73763a6e8ef2b28130"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.792585 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-trkqq" podStartSLOduration=3.014904644 podStartE2EDuration="16.792563654s" podCreationTimestamp="2025-12-02 
20:19:33 +0000 UTC" firstStartedPulling="2025-12-02 20:19:34.784840261 +0000 UTC m=+1310.085747746" lastFinishedPulling="2025-12-02 20:19:48.562499261 +0000 UTC m=+1323.863406756" observedRunningTime="2025-12-02 20:19:49.786342459 +0000 UTC m=+1325.087249954" watchObservedRunningTime="2025-12-02 20:19:49.792563654 +0000 UTC m=+1325.093471149" Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.793538 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-vsnlc" event={"ID":"5e35946b-d565-4354-9c86-8eb06b4ed154","Type":"ContainerStarted","Data":"eca2c63658f78d557f21db2f86572d8858ad989e54e23d7c9177c44f6b22d686"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.802161 4807 generic.go:334] "Generic (PLEG): container finished" podID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerID="22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5" exitCode=0 Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.802205 4807 generic.go:334] "Generic (PLEG): container finished" podID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerID="ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08" exitCode=0 Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.802280 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89cb183e-cc2d-4fd9-90d8-212c434f0d06","Type":"ContainerDied","Data":"22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.802305 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89cb183e-cc2d-4fd9-90d8-212c434f0d06","Type":"ContainerDied","Data":"ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.815774 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9qc25" 
event={"ID":"103c33fd-ac95-4efe-b1c2-b6c9187eb613","Type":"ContainerStarted","Data":"c8b9fdc8d4c172b8c2fedef97bf44709d11827e6de9adc27a2993d488088b0b8"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.826732 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8r86t" event={"ID":"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773","Type":"ContainerStarted","Data":"8845d76c1471ee4fb8c57841130d72656f50db5f461f581032375ef4ff478128"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.826786 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8r86t" event={"ID":"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773","Type":"ContainerStarted","Data":"588e11ab75b7501dcbdc7c4846e5e09f80f53b4c0c9c60d7b1c2752fa85ac260"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.831258 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qr6rh" event={"ID":"aa51800c-1f7b-4000-8547-c391f3a6dc6a","Type":"ContainerStarted","Data":"460c89258ac4537be25e8a58eb3d0b6a760237ccc1334f37191a029031696d0c"} Dec 02 20:19:49 crc kubenswrapper[4807]: I1202 20:19:49.856294 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-8r86t" podStartSLOduration=4.856272032 podStartE2EDuration="4.856272032s" podCreationTimestamp="2025-12-02 20:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:19:49.845335416 +0000 UTC m=+1325.146242911" watchObservedRunningTime="2025-12-02 20:19:49.856272032 +0000 UTC m=+1325.157179527" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.621285 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.657963 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89cb183e-cc2d-4fd9-90d8-212c434f0d06-prometheus-metric-storage-rulefiles-0\") pod \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.658034 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-thanos-prometheus-http-client-file\") pod \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.658065 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-tls-assets\") pod \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.658327 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.658414 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config-out\") pod \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.658504 
4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj5f5\" (UniqueName: \"kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-kube-api-access-cj5f5\") pod \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.658575 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-web-config\") pod \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.658613 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config\") pod \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\" (UID: \"89cb183e-cc2d-4fd9-90d8-212c434f0d06\") " Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.659363 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89cb183e-cc2d-4fd9-90d8-212c434f0d06-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "89cb183e-cc2d-4fd9-90d8-212c434f0d06" (UID: "89cb183e-cc2d-4fd9-90d8-212c434f0d06"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.672292 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "89cb183e-cc2d-4fd9-90d8-212c434f0d06" (UID: "89cb183e-cc2d-4fd9-90d8-212c434f0d06"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.682701 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "89cb183e-cc2d-4fd9-90d8-212c434f0d06" (UID: "89cb183e-cc2d-4fd9-90d8-212c434f0d06"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.712292 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config" (OuterVolumeSpecName: "config") pod "89cb183e-cc2d-4fd9-90d8-212c434f0d06" (UID: "89cb183e-cc2d-4fd9-90d8-212c434f0d06"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.719285 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config-out" (OuterVolumeSpecName: "config-out") pod "89cb183e-cc2d-4fd9-90d8-212c434f0d06" (UID: "89cb183e-cc2d-4fd9-90d8-212c434f0d06"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.720380 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-web-config" (OuterVolumeSpecName: "web-config") pod "89cb183e-cc2d-4fd9-90d8-212c434f0d06" (UID: "89cb183e-cc2d-4fd9-90d8-212c434f0d06"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.720972 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-kube-api-access-cj5f5" (OuterVolumeSpecName: "kube-api-access-cj5f5") pod "89cb183e-cc2d-4fd9-90d8-212c434f0d06" (UID: "89cb183e-cc2d-4fd9-90d8-212c434f0d06"). InnerVolumeSpecName "kube-api-access-cj5f5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.756415 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "89cb183e-cc2d-4fd9-90d8-212c434f0d06" (UID: "89cb183e-cc2d-4fd9-90d8-212c434f0d06"). InnerVolumeSpecName "pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.760364 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj5f5\" (UniqueName: \"kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-kube-api-access-cj5f5\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.760395 4807 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-web-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.760404 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.760414 4807 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/89cb183e-cc2d-4fd9-90d8-212c434f0d06-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.760425 4807 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89cb183e-cc2d-4fd9-90d8-212c434f0d06-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.760434 4807 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89cb183e-cc2d-4fd9-90d8-212c434f0d06-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.760461 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") on node \"crc\" " Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.760473 4807 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89cb183e-cc2d-4fd9-90d8-212c434f0d06-config-out\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.779981 4807 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.780161 4807 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f") on node "crc" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.845780 4807 generic.go:334] "Generic (PLEG): container finished" podID="323b935c-8f72-4514-96c0-6edea05c6498" containerID="e1ffc619cf349acbec0f5074bf5d5d110a88db30412c2e9259695fbd0b9e8a82" exitCode=0 Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.845851 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wmhlb" event={"ID":"323b935c-8f72-4514-96c0-6edea05c6498","Type":"ContainerDied","Data":"e1ffc619cf349acbec0f5074bf5d5d110a88db30412c2e9259695fbd0b9e8a82"} Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.855324 4807 generic.go:334] "Generic (PLEG): container finished" podID="1141d1f9-6d7f-46fa-8e85-511b8a0adddf" containerID="9e84cbe403435241cff518569ceed84160782277981e5cccf2945cf119e89901" exitCode=0 Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.855402 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f2a8-account-create-update-7v4b4" event={"ID":"1141d1f9-6d7f-46fa-8e85-511b8a0adddf","Type":"ContainerDied","Data":"9e84cbe403435241cff518569ceed84160782277981e5cccf2945cf119e89901"} Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.857243 4807 generic.go:334] "Generic (PLEG): container finished" podID="f9b198e4-d58c-4c1e-b6eb-35ecdeb30773" containerID="8845d76c1471ee4fb8c57841130d72656f50db5f461f581032375ef4ff478128" exitCode=0 Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.857327 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8r86t" 
event={"ID":"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773","Type":"ContainerDied","Data":"8845d76c1471ee4fb8c57841130d72656f50db5f461f581032375ef4ff478128"} Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.862573 4807 reconciler_common.go:293] "Volume detached for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.862755 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6097-account-create-update-jg6j9" event={"ID":"e6b0a41b-3334-4181-bad4-fd5cf030b010","Type":"ContainerDied","Data":"5df04e2fedee99698edbf2af49a68c5e8ed7ed34631fbe29a1cd9a7a62093528"} Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.863327 4807 generic.go:334] "Generic (PLEG): container finished" podID="e6b0a41b-3334-4181-bad4-fd5cf030b010" containerID="5df04e2fedee99698edbf2af49a68c5e8ed7ed34631fbe29a1cd9a7a62093528" exitCode=0 Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.871750 4807 generic.go:334] "Generic (PLEG): container finished" podID="aa51800c-1f7b-4000-8547-c391f3a6dc6a" containerID="e6f312279c514e140ef23636de13a4ac7c3610fd4b9f4b42203abff2be8720fa" exitCode=0 Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.871948 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qr6rh" event={"ID":"aa51800c-1f7b-4000-8547-c391f3a6dc6a","Type":"ContainerDied","Data":"e6f312279c514e140ef23636de13a4ac7c3610fd4b9f4b42203abff2be8720fa"} Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.876106 4807 generic.go:334] "Generic (PLEG): container finished" podID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerID="2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6" exitCode=0 Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.876216 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"89cb183e-cc2d-4fd9-90d8-212c434f0d06","Type":"ContainerDied","Data":"2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6"} Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.876236 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.876267 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"89cb183e-cc2d-4fd9-90d8-212c434f0d06","Type":"ContainerDied","Data":"663ce3df155777da12262f64155ce07df171f9bf4556ecc619de557d6233e6a0"} Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.876290 4807 scope.go:117] "RemoveContainer" containerID="22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.883742 4807 generic.go:334] "Generic (PLEG): container finished" podID="38f1e4f9-4739-44a2-9257-389f73e77dfa" containerID="11ad1232df83790d17009b657198dcfdbe9df3d1e34c183a2902338b24e2cdfd" exitCode=0 Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.884202 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-95a6-account-create-update-5dpp4" event={"ID":"38f1e4f9-4739-44a2-9257-389f73e77dfa","Type":"ContainerDied","Data":"11ad1232df83790d17009b657198dcfdbe9df3d1e34c183a2902338b24e2cdfd"} Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.946092 4807 scope.go:117] "RemoveContainer" containerID="2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6" Dec 02 20:19:50 crc kubenswrapper[4807]: I1202 20:19:50.981488 4807 scope.go:117] "RemoveContainer" containerID="ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.017787 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:19:51 crc 
kubenswrapper[4807]: I1202 20:19:51.034840 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.042316 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:19:51 crc kubenswrapper[4807]: E1202 20:19:51.042866 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="prometheus" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.042900 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="prometheus" Dec 02 20:19:51 crc kubenswrapper[4807]: E1202 20:19:51.042921 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="config-reloader" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.042929 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="config-reloader" Dec 02 20:19:51 crc kubenswrapper[4807]: E1202 20:19:51.042951 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="thanos-sidecar" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.042957 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="thanos-sidecar" Dec 02 20:19:51 crc kubenswrapper[4807]: E1202 20:19:51.042981 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="init-config-reloader" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.042987 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="init-config-reloader" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.043164 4807 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="config-reloader" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.043178 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="thanos-sidecar" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.043192 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" containerName="prometheus" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.044979 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.051187 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.053224 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.053481 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.054144 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.054428 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.054603 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.054807 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-nlqxt" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 
20:19:51.065935 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.169626 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a52102ed-4584-44cb-b051-4f08f862d64d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.169687 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.169727 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m4vb\" (UniqueName: \"kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-kube-api-access-7m4vb\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.169757 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a52102ed-4584-44cb-b051-4f08f862d64d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.169798 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.169831 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.169852 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.169882 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.169946 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-config\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " 
pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.169964 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.169988 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.272162 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-config\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.272237 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.272274 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.272338 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a52102ed-4584-44cb-b051-4f08f862d64d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.272365 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.272395 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m4vb\" (UniqueName: \"kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-kube-api-access-7m4vb\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.272430 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a52102ed-4584-44cb-b051-4f08f862d64d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.272478 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.272518 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.272547 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.272581 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.278474 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a52102ed-4584-44cb-b051-4f08f862d64d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc 
kubenswrapper[4807]: I1202 20:19:51.278838 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-config\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.279625 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.281134 4807 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.281167 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.281193 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8dd81587a0a3f5d67a5d533af4320b55477e158168be00efde5dc29e79f819c0/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.281668 4807 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.282107 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.283506 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.284266 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a52102ed-4584-44cb-b051-4f08f862d64d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.288510 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " 
pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.297029 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m4vb\" (UniqueName: \"kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-kube-api-access-7m4vb\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.324587 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.394972 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.556042 4807 scope.go:117] "RemoveContainer" containerID="616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.620548 4807 scope.go:117] "RemoveContainer" containerID="22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5" Dec 02 20:19:51 crc kubenswrapper[4807]: E1202 20:19:51.621288 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5\": container with ID starting with 22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5 not found: ID does not exist" containerID="22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.621321 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5"} err="failed to get container status \"22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5\": rpc error: code = NotFound desc = could not find container \"22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5\": container with ID starting with 22e7dc6c074f0f2eaa1cd8563a89c09284ca142fd6144941b85127d45aac88d5 not found: ID does not exist" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.621342 4807 scope.go:117] "RemoveContainer" containerID="2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6" Dec 02 20:19:51 crc kubenswrapper[4807]: E1202 20:19:51.621675 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6\": container with ID starting with 2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6 not found: ID does not exist" containerID="2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.621701 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6"} err="failed to get container status \"2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6\": rpc error: code = NotFound desc = could not find container \"2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6\": container with ID starting with 2ab661196fc48364552502b9353ac7d3586d02103c1567c89687234efef326b6 not found: ID does not exist" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.621728 4807 scope.go:117] "RemoveContainer" containerID="ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08" Dec 02 20:19:51 crc kubenswrapper[4807]: E1202 20:19:51.622077 4807 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08\": container with ID starting with ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08 not found: ID does not exist" containerID="ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.622120 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08"} err="failed to get container status \"ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08\": rpc error: code = NotFound desc = could not find container \"ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08\": container with ID starting with ba542f9305c01578635207aaf81a2b12ce15b5cc9f58aaffe5924a71caa2fd08 not found: ID does not exist" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.622146 4807 scope.go:117] "RemoveContainer" containerID="616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362" Dec 02 20:19:51 crc kubenswrapper[4807]: E1202 20:19:51.622453 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362\": container with ID starting with 616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362 not found: ID does not exist" containerID="616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.622470 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362"} err="failed to get container status \"616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362\": rpc error: code = NotFound desc = could not find container 
\"616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362\": container with ID starting with 616e718fa94758b80bd2d67a02d049c69f5f50978b029f71a4580dd58861f362 not found: ID does not exist" Dec 02 20:19:51 crc kubenswrapper[4807]: I1202 20:19:51.907397 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"2e3d73a43e41f6ecf17582d2b0db346ad2c577fbb6ff35910685064278369ed5"} Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.151842 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:19:52 crc kubenswrapper[4807]: W1202 20:19:52.165029 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda52102ed_4584_44cb_b051_4f08f862d64d.slice/crio-4a7eeb63fd16d69630aea44b87cbfa4634a598770bfb58af595b27d41e9a0e2f WatchSource:0}: Error finding container 4a7eeb63fd16d69630aea44b87cbfa4634a598770bfb58af595b27d41e9a0e2f: Status 404 returned error can't find the container with id 4a7eeb63fd16d69630aea44b87cbfa4634a598770bfb58af595b27d41e9a0e2f Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.247276 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-95a6-account-create-update-5dpp4" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.294533 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzr8c\" (UniqueName: \"kubernetes.io/projected/38f1e4f9-4739-44a2-9257-389f73e77dfa-kube-api-access-fzr8c\") pod \"38f1e4f9-4739-44a2-9257-389f73e77dfa\" (UID: \"38f1e4f9-4739-44a2-9257-389f73e77dfa\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.295093 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38f1e4f9-4739-44a2-9257-389f73e77dfa-operator-scripts\") pod \"38f1e4f9-4739-44a2-9257-389f73e77dfa\" (UID: \"38f1e4f9-4739-44a2-9257-389f73e77dfa\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.296092 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38f1e4f9-4739-44a2-9257-389f73e77dfa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38f1e4f9-4739-44a2-9257-389f73e77dfa" (UID: "38f1e4f9-4739-44a2-9257-389f73e77dfa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.296759 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38f1e4f9-4739-44a2-9257-389f73e77dfa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.300031 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f1e4f9-4739-44a2-9257-389f73e77dfa-kube-api-access-fzr8c" (OuterVolumeSpecName: "kube-api-access-fzr8c") pod "38f1e4f9-4739-44a2-9257-389f73e77dfa" (UID: "38f1e4f9-4739-44a2-9257-389f73e77dfa"). InnerVolumeSpecName "kube-api-access-fzr8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.399473 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzr8c\" (UniqueName: \"kubernetes.io/projected/38f1e4f9-4739-44a2-9257-389f73e77dfa-kube-api-access-fzr8c\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.695418 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6097-account-create-update-jg6j9" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.705636 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8r86t" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.716496 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f2a8-account-create-update-7v4b4" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.766482 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wmhlb" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.766483 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qr6rh" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.816283 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v48s\" (UniqueName: \"kubernetes.io/projected/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-kube-api-access-5v48s\") pod \"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773\" (UID: \"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.816325 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkdk2\" (UniqueName: \"kubernetes.io/projected/323b935c-8f72-4514-96c0-6edea05c6498-kube-api-access-pkdk2\") pod \"323b935c-8f72-4514-96c0-6edea05c6498\" (UID: \"323b935c-8f72-4514-96c0-6edea05c6498\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.816404 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-operator-scripts\") pod \"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773\" (UID: \"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.816503 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wr7c\" (UniqueName: \"kubernetes.io/projected/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-kube-api-access-5wr7c\") pod \"1141d1f9-6d7f-46fa-8e85-511b8a0adddf\" (UID: \"1141d1f9-6d7f-46fa-8e85-511b8a0adddf\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.816544 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4226x\" (UniqueName: \"kubernetes.io/projected/e6b0a41b-3334-4181-bad4-fd5cf030b010-kube-api-access-4226x\") pod \"e6b0a41b-3334-4181-bad4-fd5cf030b010\" (UID: \"e6b0a41b-3334-4181-bad4-fd5cf030b010\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.816576 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9v868\" (UniqueName: \"kubernetes.io/projected/aa51800c-1f7b-4000-8547-c391f3a6dc6a-kube-api-access-9v868\") pod \"aa51800c-1f7b-4000-8547-c391f3a6dc6a\" (UID: \"aa51800c-1f7b-4000-8547-c391f3a6dc6a\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.816627 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b0a41b-3334-4181-bad4-fd5cf030b010-operator-scripts\") pod \"e6b0a41b-3334-4181-bad4-fd5cf030b010\" (UID: \"e6b0a41b-3334-4181-bad4-fd5cf030b010\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.816655 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa51800c-1f7b-4000-8547-c391f3a6dc6a-operator-scripts\") pod \"aa51800c-1f7b-4000-8547-c391f3a6dc6a\" (UID: \"aa51800c-1f7b-4000-8547-c391f3a6dc6a\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.816683 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323b935c-8f72-4514-96c0-6edea05c6498-operator-scripts\") pod \"323b935c-8f72-4514-96c0-6edea05c6498\" (UID: \"323b935c-8f72-4514-96c0-6edea05c6498\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.816755 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-operator-scripts\") pod \"1141d1f9-6d7f-46fa-8e85-511b8a0adddf\" (UID: \"1141d1f9-6d7f-46fa-8e85-511b8a0adddf\") " Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.818137 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"1141d1f9-6d7f-46fa-8e85-511b8a0adddf" (UID: "1141d1f9-6d7f-46fa-8e85-511b8a0adddf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.820415 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b0a41b-3334-4181-bad4-fd5cf030b010-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6b0a41b-3334-4181-bad4-fd5cf030b010" (UID: "e6b0a41b-3334-4181-bad4-fd5cf030b010"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.820548 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9b198e4-d58c-4c1e-b6eb-35ecdeb30773" (UID: "f9b198e4-d58c-4c1e-b6eb-35ecdeb30773"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.820854 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/323b935c-8f72-4514-96c0-6edea05c6498-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "323b935c-8f72-4514-96c0-6edea05c6498" (UID: "323b935c-8f72-4514-96c0-6edea05c6498"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.821023 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa51800c-1f7b-4000-8547-c391f3a6dc6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa51800c-1f7b-4000-8547-c391f3a6dc6a" (UID: "aa51800c-1f7b-4000-8547-c391f3a6dc6a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.826925 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa51800c-1f7b-4000-8547-c391f3a6dc6a-kube-api-access-9v868" (OuterVolumeSpecName: "kube-api-access-9v868") pod "aa51800c-1f7b-4000-8547-c391f3a6dc6a" (UID: "aa51800c-1f7b-4000-8547-c391f3a6dc6a"). InnerVolumeSpecName "kube-api-access-9v868". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.827111 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-kube-api-access-5wr7c" (OuterVolumeSpecName: "kube-api-access-5wr7c") pod "1141d1f9-6d7f-46fa-8e85-511b8a0adddf" (UID: "1141d1f9-6d7f-46fa-8e85-511b8a0adddf"). InnerVolumeSpecName "kube-api-access-5wr7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.828456 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b0a41b-3334-4181-bad4-fd5cf030b010-kube-api-access-4226x" (OuterVolumeSpecName: "kube-api-access-4226x") pod "e6b0a41b-3334-4181-bad4-fd5cf030b010" (UID: "e6b0a41b-3334-4181-bad4-fd5cf030b010"). InnerVolumeSpecName "kube-api-access-4226x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.828951 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/323b935c-8f72-4514-96c0-6edea05c6498-kube-api-access-pkdk2" (OuterVolumeSpecName: "kube-api-access-pkdk2") pod "323b935c-8f72-4514-96c0-6edea05c6498" (UID: "323b935c-8f72-4514-96c0-6edea05c6498"). InnerVolumeSpecName "kube-api-access-pkdk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.831545 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-kube-api-access-5v48s" (OuterVolumeSpecName: "kube-api-access-5v48s") pod "f9b198e4-d58c-4c1e-b6eb-35ecdeb30773" (UID: "f9b198e4-d58c-4c1e-b6eb-35ecdeb30773"). InnerVolumeSpecName "kube-api-access-5v48s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.919128 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wr7c\" (UniqueName: \"kubernetes.io/projected/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-kube-api-access-5wr7c\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.919178 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4226x\" (UniqueName: \"kubernetes.io/projected/e6b0a41b-3334-4181-bad4-fd5cf030b010-kube-api-access-4226x\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.919190 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v868\" (UniqueName: \"kubernetes.io/projected/aa51800c-1f7b-4000-8547-c391f3a6dc6a-kube-api-access-9v868\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.919202 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6b0a41b-3334-4181-bad4-fd5cf030b010-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.919213 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa51800c-1f7b-4000-8547-c391f3a6dc6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.919224 4807 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/323b935c-8f72-4514-96c0-6edea05c6498-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.919234 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1141d1f9-6d7f-46fa-8e85-511b8a0adddf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.919245 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v48s\" (UniqueName: \"kubernetes.io/projected/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-kube-api-access-5v48s\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.919257 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkdk2\" (UniqueName: \"kubernetes.io/projected/323b935c-8f72-4514-96c0-6edea05c6498-kube-api-access-pkdk2\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.919268 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.946958 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f2a8-account-create-update-7v4b4" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.946996 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f2a8-account-create-update-7v4b4" event={"ID":"1141d1f9-6d7f-46fa-8e85-511b8a0adddf","Type":"ContainerDied","Data":"fb466f3b61826f5d9d94c1b7c08acf5ebc88e135424c1b73763a6e8ef2b28130"} Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.947109 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb466f3b61826f5d9d94c1b7c08acf5ebc88e135424c1b73763a6e8ef2b28130" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.950919 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8r86t" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.950903 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8r86t" event={"ID":"f9b198e4-d58c-4c1e-b6eb-35ecdeb30773","Type":"ContainerDied","Data":"588e11ab75b7501dcbdc7c4846e5e09f80f53b4c0c9c60d7b1c2752fa85ac260"} Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.951230 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="588e11ab75b7501dcbdc7c4846e5e09f80f53b4c0c9c60d7b1c2752fa85ac260" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.957261 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6097-account-create-update-jg6j9" event={"ID":"e6b0a41b-3334-4181-bad4-fd5cf030b010","Type":"ContainerDied","Data":"330304981f6399e1c8381a0b6b70edc451027961cc75001941999dbf23f22b85"} Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.957317 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="330304981f6399e1c8381a0b6b70edc451027961cc75001941999dbf23f22b85" Dec 02 20:19:52 crc kubenswrapper[4807]: I1202 20:19:52.957443 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6097-account-create-update-jg6j9" Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.000874 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qr6rh" Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.078487 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89cb183e-cc2d-4fd9-90d8-212c434f0d06" path="/var/lib/kubelet/pods/89cb183e-cc2d-4fd9-90d8-212c434f0d06/volumes" Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.085022 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a52102ed-4584-44cb-b051-4f08f862d64d","Type":"ContainerStarted","Data":"4a7eeb63fd16d69630aea44b87cbfa4634a598770bfb58af595b27d41e9a0e2f"} Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.085069 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qr6rh" event={"ID":"aa51800c-1f7b-4000-8547-c391f3a6dc6a","Type":"ContainerDied","Data":"460c89258ac4537be25e8a58eb3d0b6a760237ccc1334f37191a029031696d0c"} Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.085092 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="460c89258ac4537be25e8a58eb3d0b6a760237ccc1334f37191a029031696d0c" Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.085103 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"d671ba5f5a2b2735748090c5622d19706d9f8d6073ef9231c397c299bdf4ac47"} Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.085118 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"b8554a71141cf1bc97dc1777e630fecc309ad7fb8b2503ebbe7333c8b2400008"} Dec 02 20:19:53 crc 
kubenswrapper[4807]: I1202 20:19:53.088138 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-95a6-account-create-update-5dpp4" event={"ID":"38f1e4f9-4739-44a2-9257-389f73e77dfa","Type":"ContainerDied","Data":"4246feddc5fbbfa4b83717bf8100b7a7c0675b43a326191b90be36d8c1e58dc0"} Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.088194 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4246feddc5fbbfa4b83717bf8100b7a7c0675b43a326191b90be36d8c1e58dc0" Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.088279 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-95a6-account-create-update-5dpp4" Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.100024 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wmhlb" event={"ID":"323b935c-8f72-4514-96c0-6edea05c6498","Type":"ContainerDied","Data":"d43cdc8efb7be25b964c8a6421cc47fc731225af3aed41473f50e6141869ef77"} Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.100079 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d43cdc8efb7be25b964c8a6421cc47fc731225af3aed41473f50e6141869ef77" Dec 02 20:19:53 crc kubenswrapper[4807]: I1202 20:19:53.100171 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-wmhlb" Dec 02 20:19:54 crc kubenswrapper[4807]: I1202 20:19:54.115081 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"330b5ecbe6e00dac6267a447665f202dc5b0a2624d0a020681bcf1973523fe84"} Dec 02 20:19:55 crc kubenswrapper[4807]: I1202 20:19:55.128252 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a52102ed-4584-44cb-b051-4f08f862d64d","Type":"ContainerStarted","Data":"9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e"} Dec 02 20:19:58 crc kubenswrapper[4807]: I1202 20:19:58.293075 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:19:58 crc kubenswrapper[4807]: I1202 20:19:58.293479 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:20:02 crc kubenswrapper[4807]: I1202 20:20:02.211967 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-trkqq" event={"ID":"53ba8115-bc84-45ea-804d-c29add38ee3a","Type":"ContainerDied","Data":"57c32b143dbf76bcc9aceaabf44ea7e4c6f20502a7a47d4ec95ecb1f7e2e7dbe"} Dec 02 20:20:02 crc kubenswrapper[4807]: I1202 20:20:02.211902 4807 generic.go:334] "Generic (PLEG): container finished" podID="53ba8115-bc84-45ea-804d-c29add38ee3a" containerID="57c32b143dbf76bcc9aceaabf44ea7e4c6f20502a7a47d4ec95ecb1f7e2e7dbe" exitCode=0 Dec 02 
20:20:03 crc kubenswrapper[4807]: I1202 20:20:03.224897 4807 generic.go:334] "Generic (PLEG): container finished" podID="a52102ed-4584-44cb-b051-4f08f862d64d" containerID="9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e" exitCode=0 Dec 02 20:20:03 crc kubenswrapper[4807]: I1202 20:20:03.224974 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a52102ed-4584-44cb-b051-4f08f862d64d","Type":"ContainerDied","Data":"9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e"} Dec 02 20:20:09 crc kubenswrapper[4807]: E1202 20:20:09.386689 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" Dec 02 20:20:09 crc kubenswrapper[4807]: E1202 20:20:09.387666 4807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest" Dec 02 20:20:09 crc kubenswrapper[4807]: E1202 20:20:09.387857 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-db-sync,Image:38.102.83.5:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6pfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-db-sync-vsnlc_openstack(5e35946b-d565-4354-9c86-8eb06b4ed154): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:20:09 crc kubenswrapper[4807]: E1202 20:20:09.389082 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/watcher-db-sync-vsnlc" podUID="5e35946b-d565-4354-9c86-8eb06b4ed154" Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.557340 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-trkqq" Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.714251 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t62b2\" (UniqueName: \"kubernetes.io/projected/53ba8115-bc84-45ea-804d-c29add38ee3a-kube-api-access-t62b2\") pod \"53ba8115-bc84-45ea-804d-c29add38ee3a\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.714633 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-db-sync-config-data\") pod \"53ba8115-bc84-45ea-804d-c29add38ee3a\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.714828 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-config-data\") pod \"53ba8115-bc84-45ea-804d-c29add38ee3a\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.714857 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-combined-ca-bundle\") pod \"53ba8115-bc84-45ea-804d-c29add38ee3a\" (UID: \"53ba8115-bc84-45ea-804d-c29add38ee3a\") " Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.720679 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "53ba8115-bc84-45ea-804d-c29add38ee3a" (UID: "53ba8115-bc84-45ea-804d-c29add38ee3a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.720799 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53ba8115-bc84-45ea-804d-c29add38ee3a-kube-api-access-t62b2" (OuterVolumeSpecName: "kube-api-access-t62b2") pod "53ba8115-bc84-45ea-804d-c29add38ee3a" (UID: "53ba8115-bc84-45ea-804d-c29add38ee3a"). InnerVolumeSpecName "kube-api-access-t62b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.757009 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53ba8115-bc84-45ea-804d-c29add38ee3a" (UID: "53ba8115-bc84-45ea-804d-c29add38ee3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.802820 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-config-data" (OuterVolumeSpecName: "config-data") pod "53ba8115-bc84-45ea-804d-c29add38ee3a" (UID: "53ba8115-bc84-45ea-804d-c29add38ee3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.817676 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.817753 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.817773 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t62b2\" (UniqueName: \"kubernetes.io/projected/53ba8115-bc84-45ea-804d-c29add38ee3a-kube-api-access-t62b2\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:09 crc kubenswrapper[4807]: I1202 20:20:09.817791 4807 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/53ba8115-bc84-45ea-804d-c29add38ee3a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:10 crc kubenswrapper[4807]: I1202 20:20:10.342333 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a52102ed-4584-44cb-b051-4f08f862d64d","Type":"ContainerStarted","Data":"7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7"} Dec 02 20:20:10 crc kubenswrapper[4807]: I1202 20:20:10.346880 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"e1366b64af9e1126455de8dd813214bff4305c3cc67b335401558ec9bb7571f8"} Dec 02 20:20:10 crc kubenswrapper[4807]: I1202 20:20:10.347056 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"0d1335209db8a2989e7cc20e07476b2f1cce0a368a34ee112b685ba421c3ca30"} Dec 02 20:20:10 crc kubenswrapper[4807]: I1202 20:20:10.347170 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"5c35438e688835277336c389ecfd15042a8eafb48efa8cb2e64da9a0969e91a5"} Dec 02 20:20:10 crc kubenswrapper[4807]: I1202 20:20:10.349498 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-trkqq" event={"ID":"53ba8115-bc84-45ea-804d-c29add38ee3a","Type":"ContainerDied","Data":"2dfb6444d84c6e4ad81c65cc59370fe3a41db0dbf816a0b9e658e484ca42f3ed"} Dec 02 20:20:10 crc kubenswrapper[4807]: I1202 20:20:10.349523 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-trkqq" Dec 02 20:20:10 crc kubenswrapper[4807]: I1202 20:20:10.349533 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dfb6444d84c6e4ad81c65cc59370fe3a41db0dbf816a0b9e658e484ca42f3ed" Dec 02 20:20:10 crc kubenswrapper[4807]: I1202 20:20:10.352618 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9qc25" event={"ID":"103c33fd-ac95-4efe-b1c2-b6c9187eb613","Type":"ContainerStarted","Data":"c6f655e62bc881fdb84a50db51f44c7de530390cd4b3965f1d2c5e41583e4ac4"} Dec 02 20:20:10 crc kubenswrapper[4807]: E1202 20:20:10.353677 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.5:5001/podified-epoxy-centos9/openstack-watcher-api:watcher_latest\\\"\"" pod="openstack/watcher-db-sync-vsnlc" podUID="5e35946b-d565-4354-9c86-8eb06b4ed154" Dec 02 20:20:10 crc kubenswrapper[4807]: I1202 20:20:10.370267 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-db-sync-9qc25" podStartSLOduration=4.591342908 podStartE2EDuration="24.370246222s" podCreationTimestamp="2025-12-02 20:19:46 +0000 UTC" firstStartedPulling="2025-12-02 20:19:49.614555681 +0000 UTC m=+1324.915463176" lastFinishedPulling="2025-12-02 20:20:09.393458985 +0000 UTC m=+1344.694366490" observedRunningTime="2025-12-02 20:20:10.367572222 +0000 UTC m=+1345.668479717" watchObservedRunningTime="2025-12-02 20:20:10.370246222 +0000 UTC m=+1345.671153717" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.148004 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-7qr5t"] Dec 02 20:20:11 crc kubenswrapper[4807]: E1202 20:20:11.153550 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa51800c-1f7b-4000-8547-c391f3a6dc6a" containerName="mariadb-database-create" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.153577 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa51800c-1f7b-4000-8547-c391f3a6dc6a" containerName="mariadb-database-create" Dec 02 20:20:11 crc kubenswrapper[4807]: E1202 20:20:11.153614 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53ba8115-bc84-45ea-804d-c29add38ee3a" containerName="glance-db-sync" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.153623 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="53ba8115-bc84-45ea-804d-c29add38ee3a" containerName="glance-db-sync" Dec 02 20:20:11 crc kubenswrapper[4807]: E1202 20:20:11.153644 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="323b935c-8f72-4514-96c0-6edea05c6498" containerName="mariadb-database-create" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.153651 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="323b935c-8f72-4514-96c0-6edea05c6498" containerName="mariadb-database-create" Dec 02 20:20:11 crc kubenswrapper[4807]: E1202 20:20:11.153672 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1141d1f9-6d7f-46fa-8e85-511b8a0adddf" containerName="mariadb-account-create-update" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.153683 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="1141d1f9-6d7f-46fa-8e85-511b8a0adddf" containerName="mariadb-account-create-update" Dec 02 20:20:11 crc kubenswrapper[4807]: E1202 20:20:11.153705 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f1e4f9-4739-44a2-9257-389f73e77dfa" containerName="mariadb-account-create-update" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.153729 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f1e4f9-4739-44a2-9257-389f73e77dfa" containerName="mariadb-account-create-update" Dec 02 20:20:11 crc kubenswrapper[4807]: E1202 20:20:11.153744 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b198e4-d58c-4c1e-b6eb-35ecdeb30773" containerName="mariadb-database-create" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.153752 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b198e4-d58c-4c1e-b6eb-35ecdeb30773" containerName="mariadb-database-create" Dec 02 20:20:11 crc kubenswrapper[4807]: E1202 20:20:11.153762 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b0a41b-3334-4181-bad4-fd5cf030b010" containerName="mariadb-account-create-update" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.153769 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b0a41b-3334-4181-bad4-fd5cf030b010" containerName="mariadb-account-create-update" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.153981 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f1e4f9-4739-44a2-9257-389f73e77dfa" containerName="mariadb-account-create-update" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.154004 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b198e4-d58c-4c1e-b6eb-35ecdeb30773" containerName="mariadb-database-create" Dec 02 
20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.154019 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="53ba8115-bc84-45ea-804d-c29add38ee3a" containerName="glance-db-sync" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.154043 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa51800c-1f7b-4000-8547-c391f3a6dc6a" containerName="mariadb-database-create" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.154064 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="1141d1f9-6d7f-46fa-8e85-511b8a0adddf" containerName="mariadb-account-create-update" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.154077 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="323b935c-8f72-4514-96c0-6edea05c6498" containerName="mariadb-database-create" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.154088 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b0a41b-3334-4181-bad4-fd5cf030b010" containerName="mariadb-account-create-update" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.155518 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.167457 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-7qr5t"] Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.245102 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-config\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.245216 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwb9m\" (UniqueName: \"kubernetes.io/projected/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-kube-api-access-hwb9m\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.245255 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.245275 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.245327 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-dns-svc\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.346582 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.346636 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.346702 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-dns-svc\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.346744 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-config\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.346823 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwb9m\" (UniqueName: 
\"kubernetes.io/projected/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-kube-api-access-hwb9m\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.348064 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-sb\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.348064 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-config\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.348182 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-nb\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.348665 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-dns-svc\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.370065 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwb9m\" (UniqueName: \"kubernetes.io/projected/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-kube-api-access-hwb9m\") pod \"dnsmasq-dns-74dc88fc-7qr5t\" (UID: 
\"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.382478 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"c0da51acbc9298e487ed726451d1ebb2eee8b66f8fa140e2ab546e5051e55dfd"} Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.481956 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:11 crc kubenswrapper[4807]: I1202 20:20:11.805595 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-7qr5t"] Dec 02 20:20:11 crc kubenswrapper[4807]: W1202 20:20:11.814455 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f2b24f9_3d8a_4426_b664_fc1ebb488d9d.slice/crio-16151c1b91a65822ec6a62e943332e261ea0aff2858d92b89b17034f10257cfc WatchSource:0}: Error finding container 16151c1b91a65822ec6a62e943332e261ea0aff2858d92b89b17034f10257cfc: Status 404 returned error can't find the container with id 16151c1b91a65822ec6a62e943332e261ea0aff2858d92b89b17034f10257cfc Dec 02 20:20:12 crc kubenswrapper[4807]: I1202 20:20:12.404096 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" event={"ID":"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d","Type":"ContainerStarted","Data":"16151c1b91a65822ec6a62e943332e261ea0aff2858d92b89b17034f10257cfc"} Dec 02 20:20:14 crc kubenswrapper[4807]: I1202 20:20:14.428288 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a52102ed-4584-44cb-b051-4f08f862d64d","Type":"ContainerStarted","Data":"57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b"} Dec 02 20:20:14 crc kubenswrapper[4807]: I1202 20:20:14.428868 4807 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a52102ed-4584-44cb-b051-4f08f862d64d","Type":"ContainerStarted","Data":"a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9"} Dec 02 20:20:14 crc kubenswrapper[4807]: I1202 20:20:14.430388 4807 generic.go:334] "Generic (PLEG): container finished" podID="1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" containerID="3b918f0ec4719dbf7628ce7a7d4f4f05823795a4512433c7ef6fbe8f2c53e1ab" exitCode=0 Dec 02 20:20:14 crc kubenswrapper[4807]: I1202 20:20:14.430556 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" event={"ID":"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d","Type":"ContainerDied","Data":"3b918f0ec4719dbf7628ce7a7d4f4f05823795a4512433c7ef6fbe8f2c53e1ab"} Dec 02 20:20:14 crc kubenswrapper[4807]: I1202 20:20:14.464803 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=24.464774696 podStartE2EDuration="24.464774696s" podCreationTimestamp="2025-12-02 20:19:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:20:14.460662263 +0000 UTC m=+1349.761569758" watchObservedRunningTime="2025-12-02 20:20:14.464774696 +0000 UTC m=+1349.765682191" Dec 02 20:20:15 crc kubenswrapper[4807]: I1202 20:20:15.447528 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"0ab17d07b47aa4cfeec5e97b18f7606a3fb2a30d0384a4ef52f95a7d41d927ba"} Dec 02 20:20:15 crc kubenswrapper[4807]: I1202 20:20:15.447814 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"f924702aaad63dde5eeeeb1da64aa4034f2b2d8950d5e8cea3af4b6aafcb8e42"} Dec 02 20:20:15 crc 
kubenswrapper[4807]: I1202 20:20:15.447828 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"cf80186c4f8aa571e0308cf19b412531f65f7a2e1aa7713f9841de5df5159dc9"} Dec 02 20:20:15 crc kubenswrapper[4807]: I1202 20:20:15.450955 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" event={"ID":"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d","Type":"ContainerStarted","Data":"c3c0926311cd59fead8aa752b834bf050f2e0d30673563e3d2b353c21c7fab3e"} Dec 02 20:20:15 crc kubenswrapper[4807]: I1202 20:20:15.482465 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" podStartSLOduration=4.482443112 podStartE2EDuration="4.482443112s" podCreationTimestamp="2025-12-02 20:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:20:15.47296557 +0000 UTC m=+1350.773873085" watchObservedRunningTime="2025-12-02 20:20:15.482443112 +0000 UTC m=+1350.783350607" Dec 02 20:20:16 crc kubenswrapper[4807]: I1202 20:20:16.399572 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 20:20:16 crc kubenswrapper[4807]: I1202 20:20:16.482180 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:16 crc kubenswrapper[4807]: I1202 20:20:16.494812 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"65d0359e2399e672ec8f16b5ea1efa7df5e71889608b5dbb8e2ed6edc8b3e2ce"} Dec 02 20:20:16 crc kubenswrapper[4807]: I1202 20:20:16.494889 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"5619f6e2068b657af1f7eacdb0a40897d9a5ef97d84fffeca8e155ccee3357b3"} Dec 02 20:20:16 crc kubenswrapper[4807]: I1202 20:20:16.494900 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"e607ffc818e802b91f4d602301817377d90b56605865b4bb7eab136b3ae8c795"} Dec 02 20:20:17 crc kubenswrapper[4807]: I1202 20:20:17.513125 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"caaccc0b-6743-4907-9d87-f4ab26c931e2","Type":"ContainerStarted","Data":"04bead25136026582baeb0cb29a607ed5eff0a27b53422bc34e1a9e775683b66"} Dec 02 20:20:17 crc kubenswrapper[4807]: I1202 20:20:17.562426 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=42.571378183 podStartE2EDuration="1m7.562393172s" podCreationTimestamp="2025-12-02 20:19:10 +0000 UTC" firstStartedPulling="2025-12-02 20:19:49.643889895 +0000 UTC m=+1324.944797390" lastFinishedPulling="2025-12-02 20:20:14.634904874 +0000 UTC m=+1349.935812379" observedRunningTime="2025-12-02 20:20:17.559363672 +0000 UTC m=+1352.860271167" watchObservedRunningTime="2025-12-02 20:20:17.562393172 +0000 UTC m=+1352.863300677" Dec 02 20:20:17 crc kubenswrapper[4807]: I1202 20:20:17.909008 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-7qr5t"] Dec 02 20:20:17 crc kubenswrapper[4807]: I1202 20:20:17.988887 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-gdjkw"] Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.004609 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.010441 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-gdjkw"] Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.018939 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.206006 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9b8j\" (UniqueName: \"kubernetes.io/projected/20d4ea86-b339-4793-b0da-4127cf6d045f-kube-api-access-b9b8j\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.206487 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.206530 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-config\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.206559 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.206632 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.206788 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.308986 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.309078 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9b8j\" (UniqueName: \"kubernetes.io/projected/20d4ea86-b339-4793-b0da-4127cf6d045f-kube-api-access-b9b8j\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.309159 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.309203 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-config\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.309234 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.309264 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.310251 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.310800 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc 
kubenswrapper[4807]: I1202 20:20:18.311710 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.312304 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-config\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.312869 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.340300 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9b8j\" (UniqueName: \"kubernetes.io/projected/20d4ea86-b339-4793-b0da-4127cf6d045f-kube-api-access-b9b8j\") pod \"dnsmasq-dns-5f59b8f679-gdjkw\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.377441 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.527274 4807 generic.go:334] "Generic (PLEG): container finished" podID="103c33fd-ac95-4efe-b1c2-b6c9187eb613" containerID="c6f655e62bc881fdb84a50db51f44c7de530390cd4b3965f1d2c5e41583e4ac4" exitCode=0 Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.527385 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9qc25" event={"ID":"103c33fd-ac95-4efe-b1c2-b6c9187eb613","Type":"ContainerDied","Data":"c6f655e62bc881fdb84a50db51f44c7de530390cd4b3965f1d2c5e41583e4ac4"} Dec 02 20:20:18 crc kubenswrapper[4807]: I1202 20:20:18.528906 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" podUID="1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" containerName="dnsmasq-dns" containerID="cri-o://c3c0926311cd59fead8aa752b834bf050f2e0d30673563e3d2b353c21c7fab3e" gracePeriod=10 Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:18.969748 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-gdjkw"] Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.539343 4807 generic.go:334] "Generic (PLEG): container finished" podID="20d4ea86-b339-4793-b0da-4127cf6d045f" containerID="2516495c83b694e47104d88bf1614d67ae5f9238880664e6695a3cd4be6ff1d3" exitCode=0 Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.539663 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" event={"ID":"20d4ea86-b339-4793-b0da-4127cf6d045f","Type":"ContainerDied","Data":"2516495c83b694e47104d88bf1614d67ae5f9238880664e6695a3cd4be6ff1d3"} Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.540016 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" 
event={"ID":"20d4ea86-b339-4793-b0da-4127cf6d045f","Type":"ContainerStarted","Data":"ed8240789644251999b20011904bf3defa7b444ae7f360c3e48cbec4776b14df"} Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.548674 4807 generic.go:334] "Generic (PLEG): container finished" podID="1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" containerID="c3c0926311cd59fead8aa752b834bf050f2e0d30673563e3d2b353c21c7fab3e" exitCode=0 Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.548886 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" event={"ID":"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d","Type":"ContainerDied","Data":"c3c0926311cd59fead8aa752b834bf050f2e0d30673563e3d2b353c21c7fab3e"} Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.548940 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" event={"ID":"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d","Type":"ContainerDied","Data":"16151c1b91a65822ec6a62e943332e261ea0aff2858d92b89b17034f10257cfc"} Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.548956 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16151c1b91a65822ec6a62e943332e261ea0aff2858d92b89b17034f10257cfc" Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.709317 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.847129 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-nb\") pod \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.848225 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwb9m\" (UniqueName: \"kubernetes.io/projected/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-kube-api-access-hwb9m\") pod \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.848302 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-sb\") pod \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.848362 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-dns-svc\") pod \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.848399 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-config\") pod \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\" (UID: \"1f2b24f9-3d8a-4426-b664-fc1ebb488d9d\") " Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.855333 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-kube-api-access-hwb9m" (OuterVolumeSpecName: "kube-api-access-hwb9m") pod "1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" (UID: "1f2b24f9-3d8a-4426-b664-fc1ebb488d9d"). InnerVolumeSpecName "kube-api-access-hwb9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.916263 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9qc25" Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.934337 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" (UID: "1f2b24f9-3d8a-4426-b664-fc1ebb488d9d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.941749 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-config" (OuterVolumeSpecName: "config") pod "1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" (UID: "1f2b24f9-3d8a-4426-b664-fc1ebb488d9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.951112 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" (UID: "1f2b24f9-3d8a-4426-b664-fc1ebb488d9d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.951174 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwb9m\" (UniqueName: \"kubernetes.io/projected/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-kube-api-access-hwb9m\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.951196 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.951211 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:19 crc kubenswrapper[4807]: I1202 20:20:19.958263 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" (UID: "1f2b24f9-3d8a-4426-b664-fc1ebb488d9d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.052804 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-config-data\") pod \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\" (UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.053243 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkk29\" (UniqueName: \"kubernetes.io/projected/103c33fd-ac95-4efe-b1c2-b6c9187eb613-kube-api-access-xkk29\") pod \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\" (UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.053511 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-combined-ca-bundle\") pod \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\" (UID: \"103c33fd-ac95-4efe-b1c2-b6c9187eb613\") " Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.054092 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.054162 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.061917 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103c33fd-ac95-4efe-b1c2-b6c9187eb613-kube-api-access-xkk29" (OuterVolumeSpecName: "kube-api-access-xkk29") pod "103c33fd-ac95-4efe-b1c2-b6c9187eb613" (UID: 
"103c33fd-ac95-4efe-b1c2-b6c9187eb613"). InnerVolumeSpecName "kube-api-access-xkk29". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.084490 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "103c33fd-ac95-4efe-b1c2-b6c9187eb613" (UID: "103c33fd-ac95-4efe-b1c2-b6c9187eb613"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.119665 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-config-data" (OuterVolumeSpecName: "config-data") pod "103c33fd-ac95-4efe-b1c2-b6c9187eb613" (UID: "103c33fd-ac95-4efe-b1c2-b6c9187eb613"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.156056 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.156126 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkk29\" (UniqueName: \"kubernetes.io/projected/103c33fd-ac95-4efe-b1c2-b6c9187eb613-kube-api-access-xkk29\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.156142 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103c33fd-ac95-4efe-b1c2-b6c9187eb613-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.561979 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9qc25" 
event={"ID":"103c33fd-ac95-4efe-b1c2-b6c9187eb613","Type":"ContainerDied","Data":"c8b9fdc8d4c172b8c2fedef97bf44709d11827e6de9adc27a2993d488088b0b8"} Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.562059 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9qc25" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.562447 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8b9fdc8d4c172b8c2fedef97bf44709d11827e6de9adc27a2993d488088b0b8" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.565113 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74dc88fc-7qr5t" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.566568 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" event={"ID":"20d4ea86-b339-4793-b0da-4127cf6d045f","Type":"ContainerStarted","Data":"a378f809961fff3cfabae21d98185da91781c0706db8a3e15944137a826b40ad"} Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.566631 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.614340 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" podStartSLOduration=3.614311967 podStartE2EDuration="3.614311967s" podCreationTimestamp="2025-12-02 20:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:20:20.602263128 +0000 UTC m=+1355.903170633" watchObservedRunningTime="2025-12-02 20:20:20.614311967 +0000 UTC m=+1355.915219462" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.642295 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-7qr5t"] Dec 02 20:20:20 crc 
kubenswrapper[4807]: I1202 20:20:20.649501 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74dc88fc-7qr5t"] Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.851014 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-skwsf"] Dec 02 20:20:20 crc kubenswrapper[4807]: E1202 20:20:20.851457 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" containerName="init" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.851475 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" containerName="init" Dec 02 20:20:20 crc kubenswrapper[4807]: E1202 20:20:20.851497 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" containerName="dnsmasq-dns" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.851509 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" containerName="dnsmasq-dns" Dec 02 20:20:20 crc kubenswrapper[4807]: E1202 20:20:20.851528 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103c33fd-ac95-4efe-b1c2-b6c9187eb613" containerName="keystone-db-sync" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.851534 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="103c33fd-ac95-4efe-b1c2-b6c9187eb613" containerName="keystone-db-sync" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.851702 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="103c33fd-ac95-4efe-b1c2-b6c9187eb613" containerName="keystone-db-sync" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.855893 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" containerName="dnsmasq-dns" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.857084 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.860988 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-psn74" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.867439 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.867596 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.867618 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.867896 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.870704 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-gdjkw"] Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.879514 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-skwsf"] Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.973525 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-scripts\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.973571 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-combined-ca-bundle\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 
20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.973599 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-config-data\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.973791 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-fernet-keys\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.973872 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-credential-keys\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:20 crc kubenswrapper[4807]: I1202 20:20:20.973932 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n2zt\" (UniqueName: \"kubernetes.io/projected/80aba33b-991b-4849-8a5b-cba332a74c91-kube-api-access-8n2zt\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.018280 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2b24f9-3d8a-4426-b664-fc1ebb488d9d" path="/var/lib/kubelet/pods/1f2b24f9-3d8a-4426-b664-fc1ebb488d9d/volumes" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.076615 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-fernet-keys\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.076795 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-credential-keys\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.076842 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n2zt\" (UniqueName: \"kubernetes.io/projected/80aba33b-991b-4849-8a5b-cba332a74c91-kube-api-access-8n2zt\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.076943 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-combined-ca-bundle\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.076972 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-scripts\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.077005 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-config-data\") pod 
\"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.101160 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-fernet-keys\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.101273 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-config-data\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.115594 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-scripts\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.115741 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-tkqcq"] Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.118218 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.132382 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-credential-keys\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.134399 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-combined-ca-bundle\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.177237 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n2zt\" (UniqueName: \"kubernetes.io/projected/80aba33b-991b-4849-8a5b-cba332a74c91-kube-api-access-8n2zt\") pod \"keystone-bootstrap-skwsf\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") " pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.207153 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.282114 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.282173 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plqhx\" (UniqueName: \"kubernetes.io/projected/241317a5-551d-4833-8453-cab96354663b-kube-api-access-plqhx\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.282231 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-config\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.282261 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.282311 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" 
(UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.282364 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.408020 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.408121 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.408228 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.408275 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plqhx\" (UniqueName: \"kubernetes.io/projected/241317a5-551d-4833-8453-cab96354663b-kube-api-access-plqhx\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: 
\"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.408297 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-config\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.408317 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.409344 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.409892 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.410335 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.410711 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.411032 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.412397 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-config\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.427844 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.495939 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plqhx\" (UniqueName: \"kubernetes.io/projected/241317a5-551d-4833-8453-cab96354663b-kube-api-access-plqhx\") pod \"dnsmasq-dns-bbf5cc879-tkqcq\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.588916 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.600927 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.673377 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.702598 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.728446 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.728708 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.734824 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-rp8rr"] Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.736386 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.747778 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.793273 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75b4587fb9-xsql7"] Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.795559 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.811222 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xw24n" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.812426 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.812700 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.830816 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.830879 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72652f32-e037-48c8-850e-fb193ad32c5a-logs\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.830903 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-config\") pod \"neutron-db-sync-rp8rr\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.830922 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w45f2\" (UniqueName: \"kubernetes.io/projected/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-kube-api-access-w45f2\") pod 
\"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.830938 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72652f32-e037-48c8-850e-fb193ad32c5a-horizon-secret-key\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.831017 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-log-httpd\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.831065 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-run-httpd\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.831091 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-config-data\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.831112 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc 
kubenswrapper[4807]: I1202 20:20:21.831144 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-config-data\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.831172 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-scripts\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.831203 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvdkk\" (UniqueName: \"kubernetes.io/projected/72652f32-e037-48c8-850e-fb193ad32c5a-kube-api-access-zvdkk\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.831245 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-combined-ca-bundle\") pod \"neutron-db-sync-rp8rr\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.831293 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx4v7\" (UniqueName: \"kubernetes.io/projected/372e15f7-0eed-4b38-b4a9-19d3781e6e89-kube-api-access-jx4v7\") pod \"neutron-db-sync-rp8rr\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:21 crc kubenswrapper[4807]: 
I1202 20:20:21.831327 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-scripts\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.841370 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.841612 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.841793 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.841930 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-h5cxm" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.846473 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-zlf57"] Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.847912 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.874345 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.874655 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.874850 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mc9wl" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.922812 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rp8rr"] Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.933599 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zfcg\" (UniqueName: \"kubernetes.io/projected/2b4b9175-26ae-4cff-8dd2-7682b1408271-kube-api-access-7zfcg\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.933662 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-config-data\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.933689 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.933765 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-config-data\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.933787 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-scripts\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.933841 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdkk\" (UniqueName: \"kubernetes.io/projected/72652f32-e037-48c8-850e-fb193ad32c5a-kube-api-access-zvdkk\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.933868 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-combined-ca-bundle\") pod \"neutron-db-sync-rp8rr\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.933898 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx4v7\" (UniqueName: \"kubernetes.io/projected/372e15f7-0eed-4b38-b4a9-19d3781e6e89-kube-api-access-jx4v7\") pod \"neutron-db-sync-rp8rr\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.933921 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-scripts\") pod \"horizon-75b4587fb9-xsql7\" 
(UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.933955 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.933982 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b4b9175-26ae-4cff-8dd2-7682b1408271-etc-machine-id\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.934005 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72652f32-e037-48c8-850e-fb193ad32c5a-logs\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.934028 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-config\") pod \"neutron-db-sync-rp8rr\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.934051 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w45f2\" (UniqueName: \"kubernetes.io/projected/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-kube-api-access-w45f2\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.934071 
4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72652f32-e037-48c8-850e-fb193ad32c5a-horizon-secret-key\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.934110 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-config-data\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.934140 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-scripts\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.934167 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-db-sync-config-data\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.934192 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-log-httpd\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.934212 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-combined-ca-bundle\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.934244 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-run-httpd\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.934884 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-run-httpd\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.936626 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-config-data\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.941369 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-log-httpd\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.943325 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-scripts\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 
20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.947799 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72652f32-e037-48c8-850e-fb193ad32c5a-logs\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.953755 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-config\") pod \"neutron-db-sync-rp8rr\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.958904 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zlf57"] Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.962094 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-scripts\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.964692 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72652f32-e037-48c8-850e-fb193ad32c5a-horizon-secret-key\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.971240 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.981797 4807 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/horizon-75b4587fb9-xsql7"] Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.990003 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-combined-ca-bundle\") pod \"neutron-db-sync-rp8rr\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:21 crc kubenswrapper[4807]: I1202 20:20:21.990041 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx4v7\" (UniqueName: \"kubernetes.io/projected/372e15f7-0eed-4b38-b4a9-19d3781e6e89-kube-api-access-jx4v7\") pod \"neutron-db-sync-rp8rr\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.006647 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.010663 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-config-data\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.028370 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvdkk\" (UniqueName: \"kubernetes.io/projected/72652f32-e037-48c8-850e-fb193ad32c5a-kube-api-access-zvdkk\") pod \"horizon-75b4587fb9-xsql7\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.028562 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.033218 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w45f2\" (UniqueName: \"kubernetes.io/projected/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-kube-api-access-w45f2\") pod \"ceilometer-0\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " pod="openstack/ceilometer-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.038662 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b4b9175-26ae-4cff-8dd2-7682b1408271-etc-machine-id\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.038868 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-config-data\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.038988 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-scripts\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.039098 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-db-sync-config-data\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.039179 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-combined-ca-bundle\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.039288 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zfcg\" (UniqueName: \"kubernetes.io/projected/2b4b9175-26ae-4cff-8dd2-7682b1408271-kube-api-access-7zfcg\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.040630 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b4b9175-26ae-4cff-8dd2-7682b1408271-etc-machine-id\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.065209 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.132669 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-config-data\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.144855 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-scripts\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.149307 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-tgqtm"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.150670 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.157975 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.158188 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.158316 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mh454" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.159150 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-combined-ca-bundle\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.163325 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-db-sync-config-data\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.194963 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7dbb67c58f-k7fxg"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.196599 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.198824 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zfcg\" (UniqueName: \"kubernetes.io/projected/2b4b9175-26ae-4cff-8dd2-7682b1408271-kube-api-access-7zfcg\") pod \"cinder-db-sync-zlf57\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.221475 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-tgqtm"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.243012 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.273399 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-tkqcq"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.326491 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dbb67c58f-k7fxg"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.349457 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgtth\" (UniqueName: \"kubernetes.io/projected/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-kube-api-access-fgtth\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.349557 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-config-data\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.349584 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-combined-ca-bundle\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.349609 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-scripts\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.349644 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-config-data\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.349689 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bv64\" (UniqueName: \"kubernetes.io/projected/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-kube-api-access-5bv64\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.349749 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-scripts\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.349789 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-horizon-secret-key\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.349808 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-logs\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.349993 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-logs\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.374562 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-bppsk"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.375981 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bppsk" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.379251 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-j9mxl" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.385986 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.387099 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zlf57" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.502264 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-logs\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.502544 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgtth\" (UniqueName: \"kubernetes.io/projected/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-kube-api-access-fgtth\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.502606 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-combined-ca-bundle\") pod \"barbican-db-sync-bppsk\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") " pod="openstack/barbican-db-sync-bppsk" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.502691 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zvj5\" (UniqueName: \"kubernetes.io/projected/9b83e24a-ce7d-42b1-998f-1ede619914ff-kube-api-access-2zvj5\") pod \"barbican-db-sync-bppsk\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") " pod="openstack/barbican-db-sync-bppsk" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.502734 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-config-data\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " 
pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.502758 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-combined-ca-bundle\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.502847 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-scripts\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.502954 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-config-data\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.503038 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bv64\" (UniqueName: \"kubernetes.io/projected/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-kube-api-access-5bv64\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.503093 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-scripts\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.503195 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-db-sync-config-data\") pod \"barbican-db-sync-bppsk\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") " pod="openstack/barbican-db-sync-bppsk" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.503241 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-horizon-secret-key\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.503276 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-logs\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.504667 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-logs\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.505070 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-logs\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.506543 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-config-data\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.521588 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-scripts\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.533506 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-horizon-secret-key\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.537741 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-combined-ca-bundle\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.558749 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bppsk"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.563470 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgtth\" (UniqueName: \"kubernetes.io/projected/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-kube-api-access-fgtth\") pod \"horizon-7dbb67c58f-k7fxg\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.564447 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-config-data\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.580584 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bv64\" (UniqueName: \"kubernetes.io/projected/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-kube-api-access-5bv64\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.580998 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-scripts\") pod \"placement-db-sync-tgqtm\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.606889 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-combined-ca-bundle\") pod \"barbican-db-sync-bppsk\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") " pod="openstack/barbican-db-sync-bppsk" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.607410 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zvj5\" (UniqueName: \"kubernetes.io/projected/9b83e24a-ce7d-42b1-998f-1ede619914ff-kube-api-access-2zvj5\") pod \"barbican-db-sync-bppsk\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") " pod="openstack/barbican-db-sync-bppsk" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.607499 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-db-sync-config-data\") pod \"barbican-db-sync-bppsk\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") " pod="openstack/barbican-db-sync-bppsk" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.645519 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-db-sync-config-data\") pod \"barbican-db-sync-bppsk\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") " pod="openstack/barbican-db-sync-bppsk" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.648602 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-combined-ca-bundle\") pod \"barbican-db-sync-bppsk\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") " pod="openstack/barbican-db-sync-bppsk" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.673251 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.674457 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-vsnlc" event={"ID":"5e35946b-d565-4354-9c86-8eb06b4ed154","Type":"ContainerStarted","Data":"89f4e947d7dd0ac185fabb253543b0ee6fc90bf5bf9fc4172e826a69c4f7f044"} Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.708689 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.713077 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" podUID="20d4ea86-b339-4793-b0da-4127cf6d045f" containerName="dnsmasq-dns" containerID="cri-o://a378f809961fff3cfabae21d98185da91781c0706db8a3e15944137a826b40ad" gracePeriod=10 Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.721587 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-skwsf" event={"ID":"80aba33b-991b-4849-8a5b-cba332a74c91","Type":"ContainerStarted","Data":"5d7a049585863216a6ce5ece1e936680e272e510e9c531f0d1897ea84eaec501"} Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.721735 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.730332 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zvj5\" (UniqueName: \"kubernetes.io/projected/9b83e24a-ce7d-42b1-998f-1ede619914ff-kube-api-access-2zvj5\") pod \"barbican-db-sync-bppsk\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") " pod="openstack/barbican-db-sync-bppsk" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.741228 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jtsp2" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.744400 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.744576 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.745070 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.772690 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.815609 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-tgqtm" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.827952 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-tkqcq"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.845257 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.845382 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.845403 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.850251 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v74l\" (UniqueName: \"kubernetes.io/projected/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-kube-api-access-7v74l\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.850373 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.850468 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-scripts\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.850539 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-logs\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.850565 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-config-data\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.897841 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-skwsf"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.947735 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ln9hs"] Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.949396 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.953679 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.953704 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.953744 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v74l\" (UniqueName: \"kubernetes.io/projected/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-kube-api-access-7v74l\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.953774 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.953802 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-scripts\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " 
pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.955856 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.962476 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-logs\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.962519 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-config-data\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.962633 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.965393 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-logs\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.970329 4807 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.971161 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:22 crc kubenswrapper[4807]: I1202 20:20:22.992835 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-config-data\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.006925 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.010332 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v74l\" (UniqueName: \"kubernetes.io/projected/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-kube-api-access-7v74l\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.028113 4807 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.034537 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ln9hs"] Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.034694 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.035902 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-scripts\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.036004 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.038074 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.038742 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bppsk" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.039174 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-vsnlc" podStartSLOduration=6.665452086 podStartE2EDuration="38.039145782s" podCreationTimestamp="2025-12-02 20:19:45 +0000 UTC" firstStartedPulling="2025-12-02 20:19:49.722749424 +0000 UTC m=+1325.023656919" lastFinishedPulling="2025-12-02 20:20:21.09644312 +0000 UTC m=+1356.397350615" observedRunningTime="2025-12-02 20:20:22.740870607 +0000 UTC m=+1358.041778092" watchObservedRunningTime="2025-12-02 20:20:23.039145782 +0000 UTC m=+1358.340053277" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.042305 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.078388 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.080048 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.080074 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: 
\"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.080114 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-config\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.080161 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz8pc\" (UniqueName: \"kubernetes.io/projected/51d28406-b887-4c62-8b1c-f7005f6ee3c0-kube-api-access-hz8pc\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.080289 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.095342 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.101467 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-tkqcq"] Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183324 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hz8pc\" (UniqueName: \"kubernetes.io/projected/51d28406-b887-4c62-8b1c-f7005f6ee3c0-kube-api-access-hz8pc\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183428 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183477 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqk2h\" (UniqueName: \"kubernetes.io/projected/7a24c2bd-52fb-46be-9df9-f584530efc32-kube-api-access-nqk2h\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183542 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183593 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183627 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183657 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183690 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183751 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183787 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183805 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183827 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-config\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183849 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.183883 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.185088 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.185602 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-config\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.185759 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.186848 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.187027 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.194155 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.202462 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz8pc\" (UniqueName: \"kubernetes.io/projected/51d28406-b887-4c62-8b1c-f7005f6ee3c0-kube-api-access-hz8pc\") pod \"dnsmasq-dns-56df8fb6b7-ln9hs\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.286874 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.286948 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.287261 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.287392 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc 
kubenswrapper[4807]: I1202 20:20:23.287465 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.287582 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.287626 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqk2h\" (UniqueName: \"kubernetes.io/projected/7a24c2bd-52fb-46be-9df9-f584530efc32-kube-api-access-nqk2h\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.287753 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.288417 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.292701 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.293081 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.295128 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.307114 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.307790 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.309426 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.314700 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.322276 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqk2h\" (UniqueName: \"kubernetes.io/projected/7a24c2bd-52fb-46be-9df9-f584530efc32-kube-api-access-nqk2h\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.353269 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.394927 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.538868 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75b4587fb9-xsql7"] Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.573561 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rp8rr"] Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.789800 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rp8rr" event={"ID":"372e15f7-0eed-4b38-b4a9-19d3781e6e89","Type":"ContainerStarted","Data":"6ba0c38a41f65c828651a00d7769697b4e85ca2fb3a44a56595b1d6dd6fd41a2"} Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.806867 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.831322 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-skwsf" event={"ID":"80aba33b-991b-4849-8a5b-cba332a74c91","Type":"ContainerStarted","Data":"4742c9b21a7f9c5e7eae76e27daec65c71239088aaa0cd26d31987b735ceb989"} Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.913651 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-skwsf" podStartSLOduration=3.913618332 podStartE2EDuration="3.913618332s" podCreationTimestamp="2025-12-02 20:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:20:23.892658967 +0000 UTC m=+1359.193566462" watchObservedRunningTime="2025-12-02 20:20:23.913618332 +0000 UTC m=+1359.214525827" Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.949408 4807 generic.go:334] "Generic (PLEG): container finished" podID="20d4ea86-b339-4793-b0da-4127cf6d045f" containerID="a378f809961fff3cfabae21d98185da91781c0706db8a3e15944137a826b40ad" exitCode=0 Dec 02 20:20:23 crc 
kubenswrapper[4807]: I1202 20:20:23.949551 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" event={"ID":"20d4ea86-b339-4793-b0da-4127cf6d045f","Type":"ContainerDied","Data":"a378f809961fff3cfabae21d98185da91781c0706db8a3e15944137a826b40ad"} Dec 02 20:20:23 crc kubenswrapper[4807]: I1202 20:20:23.988830 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b4587fb9-xsql7" event={"ID":"72652f32-e037-48c8-850e-fb193ad32c5a","Type":"ContainerStarted","Data":"95871a1d60e36fcabbdbab5c33ecd950341a88b52f4bc12f64c1915feb4d1a2c"} Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.000736 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zlf57"] Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.008466 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" event={"ID":"241317a5-551d-4833-8453-cab96354663b","Type":"ContainerStarted","Data":"c814c6475836a467cf413a14c6ad55b21f171cb4ddc083711bca8afba1db0350"} Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.037171 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.151772 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-svc\") pod \"20d4ea86-b339-4793-b0da-4127cf6d045f\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.151867 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-nb\") pod \"20d4ea86-b339-4793-b0da-4127cf6d045f\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.151969 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-swift-storage-0\") pod \"20d4ea86-b339-4793-b0da-4127cf6d045f\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.152058 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-sb\") pod \"20d4ea86-b339-4793-b0da-4127cf6d045f\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.152208 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-config\") pod \"20d4ea86-b339-4793-b0da-4127cf6d045f\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.152362 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9b8j\" 
(UniqueName: \"kubernetes.io/projected/20d4ea86-b339-4793-b0da-4127cf6d045f-kube-api-access-b9b8j\") pod \"20d4ea86-b339-4793-b0da-4127cf6d045f\" (UID: \"20d4ea86-b339-4793-b0da-4127cf6d045f\") " Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.264368 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20d4ea86-b339-4793-b0da-4127cf6d045f-kube-api-access-b9b8j" (OuterVolumeSpecName: "kube-api-access-b9b8j") pod "20d4ea86-b339-4793-b0da-4127cf6d045f" (UID: "20d4ea86-b339-4793-b0da-4127cf6d045f"). InnerVolumeSpecName "kube-api-access-b9b8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.365365 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9b8j\" (UniqueName: \"kubernetes.io/projected/20d4ea86-b339-4793-b0da-4127cf6d045f-kube-api-access-b9b8j\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.390397 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-tgqtm"] Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.432239 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dbb67c58f-k7fxg"] Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.473988 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "20d4ea86-b339-4793-b0da-4127cf6d045f" (UID: "20d4ea86-b339-4793-b0da-4127cf6d045f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.577241 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.631960 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "20d4ea86-b339-4793-b0da-4127cf6d045f" (UID: "20d4ea86-b339-4793-b0da-4127cf6d045f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.639427 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-config" (OuterVolumeSpecName: "config") pod "20d4ea86-b339-4793-b0da-4127cf6d045f" (UID: "20d4ea86-b339-4793-b0da-4127cf6d045f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.648927 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "20d4ea86-b339-4793-b0da-4127cf6d045f" (UID: "20d4ea86-b339-4793-b0da-4127cf6d045f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.679294 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.692910 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.692934 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.688579 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "20d4ea86-b339-4793-b0da-4127cf6d045f" (UID: "20d4ea86-b339-4793-b0da-4127cf6d045f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.778644 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bppsk"] Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.795534 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20d4ea86-b339-4793-b0da-4127cf6d045f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:24 crc kubenswrapper[4807]: I1202 20:20:24.858681 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ln9hs"] Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.041366 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.056884 4807 generic.go:334] "Generic (PLEG): container finished" podID="241317a5-551d-4833-8453-cab96354663b" containerID="9ea9e38b72b896906b73848260b501f614168b7dcec2bc1bc4dde2dcee109b19" exitCode=0 Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.056991 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" event={"ID":"241317a5-551d-4833-8453-cab96354663b","Type":"ContainerDied","Data":"9ea9e38b72b896906b73848260b501f614168b7dcec2bc1bc4dde2dcee109b19"} Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.088806 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fc7d360-e78a-4042-98e1-f8a2b97c10ab","Type":"ContainerStarted","Data":"c1b82c1c5a726dbfa4a4c28f9e6106afb8f5e0df950d1b4d2b5145fec07a8a46"} Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.094932 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tgqtm" event={"ID":"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe","Type":"ContainerStarted","Data":"8ab2770ddd8a6d4b5e272d8fc37544e9782576f8be00e8997887ac7de3247cf9"} Dec 02 
20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.096811 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bppsk" event={"ID":"9b83e24a-ce7d-42b1-998f-1ede619914ff","Type":"ContainerStarted","Data":"506364bcd666e1786373d81dad748d5aab6c417cc27a956bb78b6143b7b0ffe3"} Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.109554 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dbb67c58f-k7fxg" event={"ID":"c18627ce-9a31-4f92-b1a1-6a1c63e7c583","Type":"ContainerStarted","Data":"d7f1707baebc91958388162394a061ddd8ad3556d981979b136892ca4a3bf436"} Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.130125 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zlf57" event={"ID":"2b4b9175-26ae-4cff-8dd2-7682b1408271","Type":"ContainerStarted","Data":"16cc6fefb2179350a0cc74626d6fbf5dfa16a404bd96de7b290381cc882313f1"} Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.157558 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rp8rr" event={"ID":"372e15f7-0eed-4b38-b4a9-19d3781e6e89","Type":"ContainerStarted","Data":"f9e09eea3225b6ab9506ff5d6c1236d0d7389d54ac92d17cffdc71838b7d48d0"} Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.213316 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" event={"ID":"20d4ea86-b339-4793-b0da-4127cf6d045f","Type":"ContainerDied","Data":"ed8240789644251999b20011904bf3defa7b444ae7f360c3e48cbec4776b14df"} Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.213389 4807 scope.go:117] "RemoveContainer" containerID="a378f809961fff3cfabae21d98185da91781c0706db8a3e15944137a826b40ad" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.213557 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-gdjkw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.213703 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.245696 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" event={"ID":"51d28406-b887-4c62-8b1c-f7005f6ee3c0","Type":"ContainerStarted","Data":"8dd4f1dde0f276d96c434d2a450f7df08b9f6f566f9379d3ecfd69559e60c985"} Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.383311 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.408745 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75b4587fb9-xsql7"] Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.558916 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-rp8rr" podStartSLOduration=4.558889024 podStartE2EDuration="4.558889024s" podCreationTimestamp="2025-12-02 20:20:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:20:25.413020419 +0000 UTC m=+1360.713927914" watchObservedRunningTime="2025-12-02 20:20:25.558889024 +0000 UTC m=+1360.859796519" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.580921 4807 scope.go:117] "RemoveContainer" containerID="2516495c83b694e47104d88bf1614d67ae5f9238880664e6695a3cd4be6ff1d3" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.589004 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-65d7cf6fd9-tcznw"] Dec 02 20:20:25 crc kubenswrapper[4807]: E1202 20:20:25.589614 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d4ea86-b339-4793-b0da-4127cf6d045f" containerName="dnsmasq-dns" Dec 02 20:20:25 crc 
kubenswrapper[4807]: I1202 20:20:25.589632 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d4ea86-b339-4793-b0da-4127cf6d045f" containerName="dnsmasq-dns" Dec 02 20:20:25 crc kubenswrapper[4807]: E1202 20:20:25.589646 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d4ea86-b339-4793-b0da-4127cf6d045f" containerName="init" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.589653 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d4ea86-b339-4793-b0da-4127cf6d045f" containerName="init" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.589942 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="20d4ea86-b339-4793-b0da-4127cf6d045f" containerName="dnsmasq-dns" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.591094 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.657568 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-gdjkw"] Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.743121 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-gdjkw"] Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.762266 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-config-data\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.762338 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-scripts\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " 
pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.762360 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjhq9\" (UniqueName: \"kubernetes.io/projected/00c37d10-a573-4bfb-8241-ca5c4e9b4396-kube-api-access-xjhq9\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.762415 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00c37d10-a573-4bfb-8241-ca5c4e9b4396-logs\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.762452 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00c37d10-a573-4bfb-8241-ca5c4e9b4396-horizon-secret-key\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.768049 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.802862 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65d7cf6fd9-tcznw"] Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.840298 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.864415 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00c37d10-a573-4bfb-8241-ca5c4e9b4396-horizon-secret-key\") pod 
\"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.864518 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-config-data\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.864557 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-scripts\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.864578 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjhq9\" (UniqueName: \"kubernetes.io/projected/00c37d10-a573-4bfb-8241-ca5c4e9b4396-kube-api-access-xjhq9\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.864625 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00c37d10-a573-4bfb-8241-ca5c4e9b4396-logs\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.865159 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00c37d10-a573-4bfb-8241-ca5c4e9b4396-logs\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc 
kubenswrapper[4807]: I1202 20:20:25.867468 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-scripts\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.868364 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-config-data\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.915645 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjhq9\" (UniqueName: \"kubernetes.io/projected/00c37d10-a573-4bfb-8241-ca5c4e9b4396-kube-api-access-xjhq9\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.925106 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00c37d10-a573-4bfb-8241-ca5c4e9b4396-horizon-secret-key\") pod \"horizon-65d7cf6fd9-tcznw\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:25 crc kubenswrapper[4807]: I1202 20:20:25.996282 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.188598 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.265395 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c","Type":"ContainerStarted","Data":"5babe5dc78c315284fabb15287b95a4beea12af146df1bc9594611b5cb769373"} Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.267873 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a24c2bd-52fb-46be-9df9-f584530efc32","Type":"ContainerStarted","Data":"7761fd0a7619808fe85a9c0322f8aaa313ba5821c1e3bab768bfcfb1450b3f55"} Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.269752 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" event={"ID":"241317a5-551d-4833-8453-cab96354663b","Type":"ContainerDied","Data":"c814c6475836a467cf413a14c6ad55b21f171cb4ddc083711bca8afba1db0350"} Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.269823 4807 scope.go:117] "RemoveContainer" containerID="9ea9e38b72b896906b73848260b501f614168b7dcec2bc1bc4dde2dcee109b19" Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.269934 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-tkqcq" Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.275121 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-config\") pod \"241317a5-551d-4833-8453-cab96354663b\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.275301 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-nb\") pod \"241317a5-551d-4833-8453-cab96354663b\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.275493 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plqhx\" (UniqueName: \"kubernetes.io/projected/241317a5-551d-4833-8453-cab96354663b-kube-api-access-plqhx\") pod \"241317a5-551d-4833-8453-cab96354663b\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.275810 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-svc\") pod \"241317a5-551d-4833-8453-cab96354663b\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.275881 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-sb\") pod \"241317a5-551d-4833-8453-cab96354663b\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.276154 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-swift-storage-0\") pod \"241317a5-551d-4833-8453-cab96354663b\" (UID: \"241317a5-551d-4833-8453-cab96354663b\") " Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.292430 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241317a5-551d-4833-8453-cab96354663b-kube-api-access-plqhx" (OuterVolumeSpecName: "kube-api-access-plqhx") pod "241317a5-551d-4833-8453-cab96354663b" (UID: "241317a5-551d-4833-8453-cab96354663b"). InnerVolumeSpecName "kube-api-access-plqhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.299590 4807 generic.go:334] "Generic (PLEG): container finished" podID="51d28406-b887-4c62-8b1c-f7005f6ee3c0" containerID="3e89b0a0ee43f72620258f6751cc07b2cceb26eb9b6f280d686619faa8a95a3c" exitCode=0 Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.300555 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" event={"ID":"51d28406-b887-4c62-8b1c-f7005f6ee3c0","Type":"ContainerDied","Data":"3e89b0a0ee43f72620258f6751cc07b2cceb26eb9b6f280d686619faa8a95a3c"} Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.321320 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-config" (OuterVolumeSpecName: "config") pod "241317a5-551d-4833-8453-cab96354663b" (UID: "241317a5-551d-4833-8453-cab96354663b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.351180 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "241317a5-551d-4833-8453-cab96354663b" (UID: "241317a5-551d-4833-8453-cab96354663b"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.360133 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "241317a5-551d-4833-8453-cab96354663b" (UID: "241317a5-551d-4833-8453-cab96354663b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.363976 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "241317a5-551d-4833-8453-cab96354663b" (UID: "241317a5-551d-4833-8453-cab96354663b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.368842 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "241317a5-551d-4833-8453-cab96354663b" (UID: "241317a5-551d-4833-8453-cab96354663b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.378770 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.378802 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.379033 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-config\") on node \"crc\" DevicePath \"\""
Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.380238 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.380272 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plqhx\" (UniqueName: \"kubernetes.io/projected/241317a5-551d-4833-8453-cab96354663b-kube-api-access-plqhx\") on node \"crc\" DevicePath \"\""
Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.380288 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/241317a5-551d-4833-8453-cab96354663b-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.929782 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-tkqcq"]
Dec 02 20:20:26 crc kubenswrapper[4807]: I1202 20:20:26.959918 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-tkqcq"]
Dec 02 20:20:27 crc kubenswrapper[4807]: I1202 20:20:27.016980 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20d4ea86-b339-4793-b0da-4127cf6d045f" path="/var/lib/kubelet/pods/20d4ea86-b339-4793-b0da-4127cf6d045f/volumes"
Dec 02 20:20:27 crc kubenswrapper[4807]: I1202 20:20:27.017666 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241317a5-551d-4833-8453-cab96354663b" path="/var/lib/kubelet/pods/241317a5-551d-4833-8453-cab96354663b/volumes"
Dec 02 20:20:27 crc kubenswrapper[4807]: I1202 20:20:27.110010 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65d7cf6fd9-tcznw"]
Dec 02 20:20:27 crc kubenswrapper[4807]: W1202 20:20:27.146413 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00c37d10_a573_4bfb_8241_ca5c4e9b4396.slice/crio-8df23e59fd833bd2f88be17306415a171bfb1b3b56d9fd1c0f4d532930e50ee7 WatchSource:0}: Error finding container 8df23e59fd833bd2f88be17306415a171bfb1b3b56d9fd1c0f4d532930e50ee7: Status 404 returned error can't find the container with id 8df23e59fd833bd2f88be17306415a171bfb1b3b56d9fd1c0f4d532930e50ee7
Dec 02 20:20:27 crc kubenswrapper[4807]: E1202 20:20:27.261141 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod241317a5_551d_4833_8453_cab96354663b.slice/crio-c814c6475836a467cf413a14c6ad55b21f171cb4ddc083711bca8afba1db0350\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod241317a5_551d_4833_8453_cab96354663b.slice\": RecentStats: unable to find data in memory cache]"
Dec 02 20:20:27 crc kubenswrapper[4807]: I1202 20:20:27.339792 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" event={"ID":"51d28406-b887-4c62-8b1c-f7005f6ee3c0","Type":"ContainerStarted","Data":"3b720548325b56bbd73e3a091b36150cb610105cc5099286cdec944dcb1bb854"}
Dec 02 20:20:27 crc kubenswrapper[4807]: I1202 20:20:27.340436 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs"
Dec 02 20:20:27 crc kubenswrapper[4807]: I1202 20:20:27.358410 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c","Type":"ContainerStarted","Data":"a19c47ea0b69774ecca4da191b8a01fcd53207335be53695ca15985b92d97b05"}
Dec 02 20:20:27 crc kubenswrapper[4807]: I1202 20:20:27.368808 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a24c2bd-52fb-46be-9df9-f584530efc32","Type":"ContainerStarted","Data":"4299539d7a41be88d46f14169b00a873041a1554e409cabc356b79f88ca75629"}
Dec 02 20:20:27 crc kubenswrapper[4807]: I1202 20:20:27.376840 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65d7cf6fd9-tcznw" event={"ID":"00c37d10-a573-4bfb-8241-ca5c4e9b4396","Type":"ContainerStarted","Data":"8df23e59fd833bd2f88be17306415a171bfb1b3b56d9fd1c0f4d532930e50ee7"}
Dec 02 20:20:27 crc kubenswrapper[4807]: I1202 20:20:27.400140 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" podStartSLOduration=5.40011441 podStartE2EDuration="5.40011441s" podCreationTimestamp="2025-12-02 20:20:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:20:27.398918015 +0000 UTC m=+1362.699825510" watchObservedRunningTime="2025-12-02 20:20:27.40011441 +0000 UTC m=+1362.701021905"
Dec 02 20:20:28 crc kubenswrapper[4807]: I1202 20:20:28.299338 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 20:20:28 crc kubenswrapper[4807]: I1202 20:20:28.299819 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 20:20:28 crc kubenswrapper[4807]: I1202 20:20:28.439628 4807 generic.go:334] "Generic (PLEG): container finished" podID="5e35946b-d565-4354-9c86-8eb06b4ed154" containerID="89f4e947d7dd0ac185fabb253543b0ee6fc90bf5bf9fc4172e826a69c4f7f044" exitCode=0
Dec 02 20:20:28 crc kubenswrapper[4807]: I1202 20:20:28.443356 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-vsnlc" event={"ID":"5e35946b-d565-4354-9c86-8eb06b4ed154","Type":"ContainerDied","Data":"89f4e947d7dd0ac185fabb253543b0ee6fc90bf5bf9fc4172e826a69c4f7f044"}
Dec 02 20:20:29 crc kubenswrapper[4807]: I1202 20:20:29.459004 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c","Type":"ContainerStarted","Data":"0c1915cb9f9da104e3887de5c5ee30cbccb751369c923cac807a484b0025e81c"}
Dec 02 20:20:29 crc kubenswrapper[4807]: I1202 20:20:29.459119 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" containerName="glance-log" containerID="cri-o://a19c47ea0b69774ecca4da191b8a01fcd53207335be53695ca15985b92d97b05" gracePeriod=30
Dec 02 20:20:29 crc kubenswrapper[4807]: I1202 20:20:29.459520 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" containerName="glance-httpd" containerID="cri-o://0c1915cb9f9da104e3887de5c5ee30cbccb751369c923cac807a484b0025e81c" gracePeriod=30
Dec 02 20:20:29 crc kubenswrapper[4807]: I1202 20:20:29.466662 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a24c2bd-52fb-46be-9df9-f584530efc32","Type":"ContainerStarted","Data":"bd26f685403ee381d454ce4d67e7852ce931899d2fa15b2adb729b2a07c06f12"}
Dec 02 20:20:29 crc kubenswrapper[4807]: I1202 20:20:29.466931 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a24c2bd-52fb-46be-9df9-f584530efc32" containerName="glance-log" containerID="cri-o://4299539d7a41be88d46f14169b00a873041a1554e409cabc356b79f88ca75629" gracePeriod=30
Dec 02 20:20:29 crc kubenswrapper[4807]: I1202 20:20:29.467050 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a24c2bd-52fb-46be-9df9-f584530efc32" containerName="glance-httpd" containerID="cri-o://bd26f685403ee381d454ce4d67e7852ce931899d2fa15b2adb729b2a07c06f12" gracePeriod=30
Dec 02 20:20:29 crc kubenswrapper[4807]: I1202 20:20:29.487707 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.487686611 podStartE2EDuration="7.487686611s" podCreationTimestamp="2025-12-02 20:20:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:20:29.480920231 +0000 UTC m=+1364.781827746" watchObservedRunningTime="2025-12-02 20:20:29.487686611 +0000 UTC m=+1364.788594096"
Dec 02 20:20:29 crc kubenswrapper[4807]: I1202 20:20:29.511461 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.511441764 podStartE2EDuration="7.511441764s" podCreationTimestamp="2025-12-02 20:20:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:20:29.510880537 +0000 UTC m=+1364.811788042" watchObservedRunningTime="2025-12-02 20:20:29.511441764 +0000 UTC m=+1364.812349259"
Dec 02 20:20:30 crc kubenswrapper[4807]: I1202 20:20:30.489782 4807 generic.go:334] "Generic (PLEG): container finished" podID="80aba33b-991b-4849-8a5b-cba332a74c91" containerID="4742c9b21a7f9c5e7eae76e27daec65c71239088aaa0cd26d31987b735ceb989" exitCode=0
Dec 02 20:20:30 crc kubenswrapper[4807]: I1202 20:20:30.489859 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-skwsf" event={"ID":"80aba33b-991b-4849-8a5b-cba332a74c91","Type":"ContainerDied","Data":"4742c9b21a7f9c5e7eae76e27daec65c71239088aaa0cd26d31987b735ceb989"}
Dec 02 20:20:30 crc kubenswrapper[4807]: I1202 20:20:30.500451 4807 generic.go:334] "Generic (PLEG): container finished" podID="fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" containerID="0c1915cb9f9da104e3887de5c5ee30cbccb751369c923cac807a484b0025e81c" exitCode=0
Dec 02 20:20:30 crc kubenswrapper[4807]: I1202 20:20:30.500486 4807 generic.go:334] "Generic (PLEG): container finished" podID="fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" containerID="a19c47ea0b69774ecca4da191b8a01fcd53207335be53695ca15985b92d97b05" exitCode=143
Dec 02 20:20:30 crc kubenswrapper[4807]: I1202 20:20:30.500539 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c","Type":"ContainerDied","Data":"0c1915cb9f9da104e3887de5c5ee30cbccb751369c923cac807a484b0025e81c"}
Dec 02 20:20:30 crc kubenswrapper[4807]: I1202 20:20:30.500572 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c","Type":"ContainerDied","Data":"a19c47ea0b69774ecca4da191b8a01fcd53207335be53695ca15985b92d97b05"}
Dec 02 20:20:30 crc kubenswrapper[4807]: I1202 20:20:30.508759 4807 generic.go:334] "Generic (PLEG): container finished" podID="7a24c2bd-52fb-46be-9df9-f584530efc32" containerID="bd26f685403ee381d454ce4d67e7852ce931899d2fa15b2adb729b2a07c06f12" exitCode=0
Dec 02 20:20:30 crc kubenswrapper[4807]: I1202 20:20:30.508798 4807 generic.go:334] "Generic (PLEG): container finished" podID="7a24c2bd-52fb-46be-9df9-f584530efc32" containerID="4299539d7a41be88d46f14169b00a873041a1554e409cabc356b79f88ca75629" exitCode=143
Dec 02 20:20:30 crc kubenswrapper[4807]: I1202 20:20:30.508835 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a24c2bd-52fb-46be-9df9-f584530efc32","Type":"ContainerDied","Data":"bd26f685403ee381d454ce4d67e7852ce931899d2fa15b2adb729b2a07c06f12"}
Dec 02 20:20:30 crc kubenswrapper[4807]: I1202 20:20:30.508866 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a24c2bd-52fb-46be-9df9-f584530efc32","Type":"ContainerDied","Data":"4299539d7a41be88d46f14169b00a873041a1554e409cabc356b79f88ca75629"}
Dec 02 20:20:31 crc kubenswrapper[4807]: I1202 20:20:31.891571 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dbb67c58f-k7fxg"]
Dec 02 20:20:31 crc kubenswrapper[4807]: I1202 20:20:31.933947 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7cc8fc8c44-b8pmd"]
Dec 02 20:20:31 crc kubenswrapper[4807]: E1202 20:20:31.934477 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241317a5-551d-4833-8453-cab96354663b" containerName="init"
Dec 02 20:20:31 crc kubenswrapper[4807]: I1202 20:20:31.934491 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="241317a5-551d-4833-8453-cab96354663b" containerName="init"
Dec 02 20:20:31 crc kubenswrapper[4807]: I1202 20:20:31.934728 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="241317a5-551d-4833-8453-cab96354663b" containerName="init"
Dec 02 20:20:31 crc kubenswrapper[4807]: I1202 20:20:31.935968 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:31 crc kubenswrapper[4807]: I1202 20:20:31.942644 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Dec 02 20:20:31 crc kubenswrapper[4807]: I1202 20:20:31.993909 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cc8fc8c44-b8pmd"]
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.066299 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-65d7cf6fd9-tcznw"]
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.106217 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cbfd7dcb-hzflv"]
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.107307 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk96s\" (UniqueName: \"kubernetes.io/projected/62551774-7dc7-4727-a79d-f92d4f82d560-kube-api-access-gk96s\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.107393 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-combined-ca-bundle\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.107691 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-scripts\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.107771 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62551774-7dc7-4727-a79d-f92d4f82d560-logs\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.108058 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-tls-certs\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.108117 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-config-data\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.108276 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-secret-key\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.108609 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.127801 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cbfd7dcb-hzflv"]
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.212442 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-scripts\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.217543 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62551774-7dc7-4727-a79d-f92d4f82d560-logs\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.219083 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-289rb\" (UniqueName: \"kubernetes.io/projected/f5570109-9e91-473c-8a41-47081ace3591-kube-api-access-289rb\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.219645 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5570109-9e91-473c-8a41-47081ace3591-scripts\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.220133 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5570109-9e91-473c-8a41-47081ace3591-horizon-tls-certs\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.220395 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f5570109-9e91-473c-8a41-47081ace3591-horizon-secret-key\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.220623 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5570109-9e91-473c-8a41-47081ace3591-logs\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.220941 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-tls-certs\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.222946 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-config-data\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.223501 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5570109-9e91-473c-8a41-47081ace3591-combined-ca-bundle\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.223668 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-secret-key\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.223980 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5570109-9e91-473c-8a41-47081ace3591-config-data\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.224186 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk96s\" (UniqueName: \"kubernetes.io/projected/62551774-7dc7-4727-a79d-f92d4f82d560-kube-api-access-gk96s\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.224353 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-combined-ca-bundle\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.224530 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-config-data\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.214513 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-scripts\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.218844 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62551774-7dc7-4727-a79d-f92d4f82d560-logs\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.243316 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-secret-key\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.243328 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-combined-ca-bundle\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.243732 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-tls-certs\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.266644 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk96s\" (UniqueName: \"kubernetes.io/projected/62551774-7dc7-4727-a79d-f92d4f82d560-kube-api-access-gk96s\") pod \"horizon-7cc8fc8c44-b8pmd\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.287448 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cc8fc8c44-b8pmd"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.328599 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-289rb\" (UniqueName: \"kubernetes.io/projected/f5570109-9e91-473c-8a41-47081ace3591-kube-api-access-289rb\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.329079 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5570109-9e91-473c-8a41-47081ace3591-scripts\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.332979 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5570109-9e91-473c-8a41-47081ace3591-scripts\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.334322 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5570109-9e91-473c-8a41-47081ace3591-horizon-tls-certs\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.334424 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f5570109-9e91-473c-8a41-47081ace3591-horizon-secret-key\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.334537 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5570109-9e91-473c-8a41-47081ace3591-logs\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.334688 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5570109-9e91-473c-8a41-47081ace3591-combined-ca-bundle\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.335270 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5570109-9e91-473c-8a41-47081ace3591-config-data\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.336603 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5570109-9e91-473c-8a41-47081ace3591-config-data\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.340077 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5570109-9e91-473c-8a41-47081ace3591-logs\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.356218 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f5570109-9e91-473c-8a41-47081ace3591-horizon-secret-key\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.359403 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5570109-9e91-473c-8a41-47081ace3591-combined-ca-bundle\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.369311 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5570109-9e91-473c-8a41-47081ace3591-horizon-tls-certs\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.372404 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-289rb\" (UniqueName: \"kubernetes.io/projected/f5570109-9e91-473c-8a41-47081ace3591-kube-api-access-289rb\") pod \"horizon-5cbfd7dcb-hzflv\" (UID: \"f5570109-9e91-473c-8a41-47081ace3591\") " pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:32 crc kubenswrapper[4807]: I1202 20:20:32.438438 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cbfd7dcb-hzflv"
Dec 02 20:20:33 crc kubenswrapper[4807]: I1202 20:20:33.317679 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs"
Dec 02 20:20:33 crc kubenswrapper[4807]: I1202 20:20:33.410691 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-tdkbv"]
Dec 02 20:20:33 crc kubenswrapper[4807]: I1202 20:20:33.411098 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" containerName="dnsmasq-dns" containerID="cri-o://66603f698908783c8443c16b6ab879e122ba6a4191f63cfc19f612f4335ef295" gracePeriod=10
Dec 02 20:20:33 crc kubenswrapper[4807]: I1202 20:20:33.571900 4807 generic.go:334] "Generic (PLEG): container finished" podID="9e632286-c5c3-48a0-a79c-445edea3b864" containerID="66603f698908783c8443c16b6ab879e122ba6a4191f63cfc19f612f4335ef295" exitCode=0
Dec 02 20:20:33 crc kubenswrapper[4807]: I1202 20:20:33.571970 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" event={"ID":"9e632286-c5c3-48a0-a79c-445edea3b864","Type":"ContainerDied","Data":"66603f698908783c8443c16b6ab879e122ba6a4191f63cfc19f612f4335ef295"}
Dec 02 20:20:35 crc kubenswrapper[4807]: I1202 20:20:35.027795 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused"
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.496036 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-skwsf"
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.551614 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n2zt\" (UniqueName: \"kubernetes.io/projected/80aba33b-991b-4849-8a5b-cba332a74c91-kube-api-access-8n2zt\") pod \"80aba33b-991b-4849-8a5b-cba332a74c91\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") "
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.551823 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-credential-keys\") pod \"80aba33b-991b-4849-8a5b-cba332a74c91\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") "
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.551886 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-config-data\") pod \"80aba33b-991b-4849-8a5b-cba332a74c91\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") "
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.551914 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-scripts\") pod \"80aba33b-991b-4849-8a5b-cba332a74c91\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") "
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.551966 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-fernet-keys\") pod \"80aba33b-991b-4849-8a5b-cba332a74c91\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") "
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.552058 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-combined-ca-bundle\") pod \"80aba33b-991b-4849-8a5b-cba332a74c91\" (UID: \"80aba33b-991b-4849-8a5b-cba332a74c91\") "
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.558453 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80aba33b-991b-4849-8a5b-cba332a74c91-kube-api-access-8n2zt" (OuterVolumeSpecName: "kube-api-access-8n2zt") pod "80aba33b-991b-4849-8a5b-cba332a74c91" (UID: "80aba33b-991b-4849-8a5b-cba332a74c91"). InnerVolumeSpecName "kube-api-access-8n2zt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.562442 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "80aba33b-991b-4849-8a5b-cba332a74c91" (UID: "80aba33b-991b-4849-8a5b-cba332a74c91"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.566990 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "80aba33b-991b-4849-8a5b-cba332a74c91" (UID: "80aba33b-991b-4849-8a5b-cba332a74c91"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.568958 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-scripts" (OuterVolumeSpecName: "scripts") pod "80aba33b-991b-4849-8a5b-cba332a74c91" (UID: "80aba33b-991b-4849-8a5b-cba332a74c91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.589976 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-config-data" (OuterVolumeSpecName: "config-data") pod "80aba33b-991b-4849-8a5b-cba332a74c91" (UID: "80aba33b-991b-4849-8a5b-cba332a74c91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.591141 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80aba33b-991b-4849-8a5b-cba332a74c91" (UID: "80aba33b-991b-4849-8a5b-cba332a74c91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.615739 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-skwsf" event={"ID":"80aba33b-991b-4849-8a5b-cba332a74c91","Type":"ContainerDied","Data":"5d7a049585863216a6ce5ece1e936680e272e510e9c531f0d1897ea84eaec501"}
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.615799 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d7a049585863216a6ce5ece1e936680e272e510e9c531f0d1897ea84eaec501"
Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.615863 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-skwsf" Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.659245 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n2zt\" (UniqueName: \"kubernetes.io/projected/80aba33b-991b-4849-8a5b-cba332a74c91-kube-api-access-8n2zt\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.659311 4807 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.659323 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.659333 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.659346 4807 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:36 crc kubenswrapper[4807]: I1202 20:20:36.659355 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80aba33b-991b-4849-8a5b-cba332a74c91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.615414 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-skwsf"] Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.626601 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-skwsf"] Dec 02 20:20:37 crc 
kubenswrapper[4807]: I1202 20:20:37.699202 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vm4lf"] Dec 02 20:20:37 crc kubenswrapper[4807]: E1202 20:20:37.699833 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80aba33b-991b-4849-8a5b-cba332a74c91" containerName="keystone-bootstrap" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.699854 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="80aba33b-991b-4849-8a5b-cba332a74c91" containerName="keystone-bootstrap" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.700098 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="80aba33b-991b-4849-8a5b-cba332a74c91" containerName="keystone-bootstrap" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.700957 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.704759 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.704968 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.705305 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.705450 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-psn74" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.706299 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.712947 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vm4lf"] Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.892471 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-fernet-keys\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.892568 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc89d\" (UniqueName: \"kubernetes.io/projected/9af9ee27-367c-4051-95bd-78ede0827b19-kube-api-access-bc89d\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.892635 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-combined-ca-bundle\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.892682 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-scripts\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.892712 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-config-data\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:37 crc kubenswrapper[4807]: I1202 20:20:37.892772 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-credential-keys\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.000666 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-combined-ca-bundle\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.000775 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-scripts\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.000819 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-config-data\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.000884 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-credential-keys\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.000978 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-fernet-keys\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.001031 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc89d\" (UniqueName: \"kubernetes.io/projected/9af9ee27-367c-4051-95bd-78ede0827b19-kube-api-access-bc89d\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.008316 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-credential-keys\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.009084 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-config-data\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.009550 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-fernet-keys\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.010346 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-combined-ca-bundle\") pod \"keystone-bootstrap-vm4lf\" (UID: 
\"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.012065 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-scripts\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.033841 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc89d\" (UniqueName: \"kubernetes.io/projected/9af9ee27-367c-4051-95bd-78ede0827b19-kube-api-access-bc89d\") pod \"keystone-bootstrap-vm4lf\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.322501 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:20:38 crc kubenswrapper[4807]: I1202 20:20:38.995092 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80aba33b-991b-4849-8a5b-cba332a74c91" path="/var/lib/kubelet/pods/80aba33b-991b-4849-8a5b-cba332a74c91/volumes" Dec 02 20:20:40 crc kubenswrapper[4807]: I1202 20:20:40.027148 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.027123 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.027923 4807 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:20:45 crc kubenswrapper[4807]: E1202 20:20:45.287003 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 02 20:20:45 crc kubenswrapper[4807]: E1202 20:20:45.287565 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndch55dh559h678h64dh5dbh54bh74h558h5bdh6bhb5h77h67fh555h77h697h66fh5fdh88h57fh576h99h65chd9h6bh566hf7hfbh5b8h67bh7bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fgtth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUse
r:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7dbb67c58f-k7fxg_openstack(c18627ce-9a31-4f92-b1a1-6a1c63e7c583): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:20:45 crc kubenswrapper[4807]: E1202 20:20:45.289822 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7dbb67c58f-k7fxg" podUID="c18627ce-9a31-4f92-b1a1-6a1c63e7c583" Dec 02 20:20:45 crc kubenswrapper[4807]: E1202 20:20:45.294793 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 02 20:20:45 crc kubenswrapper[4807]: E1202 20:20:45.295115 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7fh5bbh687h5dfhfh5b7h68h88h5c4h64ch588hd6h598h668h649hbdh68dhc8hb7hbbh85h88h74h686h74hch68bh549h5bbhd9h684h67cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zvdkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-75b4587fb9-xsql7_openstack(72652f32-e037-48c8-850e-fb193ad32c5a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:20:45 crc kubenswrapper[4807]: E1202 
20:20:45.297234 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-75b4587fb9-xsql7" podUID="72652f32-e037-48c8-850e-fb193ad32c5a" Dec 02 20:20:45 crc kubenswrapper[4807]: E1202 20:20:45.346603 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 02 20:20:45 crc kubenswrapper[4807]: E1202 20:20:45.348124 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n674h675hddh549hdch58dh96h65fh669h56h65h5dbh596hch86h5d5h7bh599hc5h8dh559h56ch65dhb4h54ch576h5dh584h655h54h679h655q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xjhq9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-65d7cf6fd9-tcznw_openstack(00c37d10-a573-4bfb-8241-ca5c4e9b4396): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:20:45 crc kubenswrapper[4807]: E1202 
20:20:45.373335 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-65d7cf6fd9-tcznw" podUID="00c37d10-a573-4bfb-8241-ca5c4e9b4396" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.448059 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.571400 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6pfp\" (UniqueName: \"kubernetes.io/projected/5e35946b-d565-4354-9c86-8eb06b4ed154-kube-api-access-b6pfp\") pod \"5e35946b-d565-4354-9c86-8eb06b4ed154\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.572173 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-config-data\") pod \"5e35946b-d565-4354-9c86-8eb06b4ed154\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.572502 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-db-sync-config-data\") pod \"5e35946b-d565-4354-9c86-8eb06b4ed154\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.572891 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-combined-ca-bundle\") 
pod \"5e35946b-d565-4354-9c86-8eb06b4ed154\" (UID: \"5e35946b-d565-4354-9c86-8eb06b4ed154\") " Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.582872 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5e35946b-d565-4354-9c86-8eb06b4ed154" (UID: "5e35946b-d565-4354-9c86-8eb06b4ed154"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.583413 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e35946b-d565-4354-9c86-8eb06b4ed154-kube-api-access-b6pfp" (OuterVolumeSpecName: "kube-api-access-b6pfp") pod "5e35946b-d565-4354-9c86-8eb06b4ed154" (UID: "5e35946b-d565-4354-9c86-8eb06b4ed154"). InnerVolumeSpecName "kube-api-access-b6pfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.607680 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e35946b-d565-4354-9c86-8eb06b4ed154" (UID: "5e35946b-d565-4354-9c86-8eb06b4ed154"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.638518 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-config-data" (OuterVolumeSpecName: "config-data") pod "5e35946b-d565-4354-9c86-8eb06b4ed154" (UID: "5e35946b-d565-4354-9c86-8eb06b4ed154"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.675708 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6pfp\" (UniqueName: \"kubernetes.io/projected/5e35946b-d565-4354-9c86-8eb06b4ed154-kube-api-access-b6pfp\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.675760 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.675862 4807 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.675878 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e35946b-d565-4354-9c86-8eb06b4ed154-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.728081 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-vsnlc" event={"ID":"5e35946b-d565-4354-9c86-8eb06b4ed154","Type":"ContainerDied","Data":"eca2c63658f78d557f21db2f86572d8858ad989e54e23d7c9177c44f6b22d686"} Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.728168 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eca2c63658f78d557f21db2f86572d8858ad989e54e23d7c9177c44f6b22d686" Dec 02 20:20:45 crc kubenswrapper[4807]: I1202 20:20:45.729091 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-vsnlc" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.789232 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 20:20:46 crc kubenswrapper[4807]: E1202 20:20:46.790482 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e35946b-d565-4354-9c86-8eb06b4ed154" containerName="watcher-db-sync" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.790495 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e35946b-d565-4354-9c86-8eb06b4ed154" containerName="watcher-db-sync" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.790737 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e35946b-d565-4354-9c86-8eb06b4ed154" containerName="watcher-db-sync" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.791426 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.794637 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.794820 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-5lgzc" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.805736 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.905240 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqqhl\" (UniqueName: \"kubernetes.io/projected/95945b94-12eb-4077-b233-45c5c2b6b51d-kube-api-access-jqqhl\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.905385 
4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-config-data\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.905407 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.905478 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95945b94-12eb-4077-b233-45c5c2b6b51d-logs\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.905499 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.945690 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.947429 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.950841 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.962341 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.963845 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 02 20:20:46 crc kubenswrapper[4807]: I1202 20:20:46.968108 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:46.998889 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:46.998956 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008033 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008072 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-config-data\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008111 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1d68c545-0435-4f66-a351-3ccba6fa68a3-config-data\") pod \"watcher-applier-0\" (UID: \"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008145 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-config-data\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008175 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95945b94-12eb-4077-b233-45c5c2b6b51d-logs\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008201 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008231 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lwqt\" (UniqueName: \"kubernetes.io/projected/1d68c545-0435-4f66-a351-3ccba6fa68a3-kube-api-access-6lwqt\") pod \"watcher-applier-0\" (UID: \"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008285 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7pbp\" (UniqueName: \"kubernetes.io/projected/4e60dc88-91d1-4325-9832-f9a921502710-kube-api-access-p7pbp\") 
pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008306 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqqhl\" (UniqueName: \"kubernetes.io/projected/95945b94-12eb-4077-b233-45c5c2b6b51d-kube-api-access-jqqhl\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008329 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008360 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d68c545-0435-4f66-a351-3ccba6fa68a3-logs\") pod \"watcher-applier-0\" (UID: \"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008387 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e60dc88-91d1-4325-9832-f9a921502710-logs\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008412 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 
20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.008430 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d68c545-0435-4f66-a351-3ccba6fa68a3-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.010341 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95945b94-12eb-4077-b233-45c5c2b6b51d-logs\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.021706 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.023547 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-config-data\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.036368 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.045981 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jqqhl\" (UniqueName: \"kubernetes.io/projected/95945b94-12eb-4077-b233-45c5c2b6b51d-kube-api-access-jqqhl\") pod \"watcher-decision-engine-0\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.113465 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-config-data\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.113566 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lwqt\" (UniqueName: \"kubernetes.io/projected/1d68c545-0435-4f66-a351-3ccba6fa68a3-kube-api-access-6lwqt\") pod \"watcher-applier-0\" (UID: \"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.113624 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7pbp\" (UniqueName: \"kubernetes.io/projected/4e60dc88-91d1-4325-9832-f9a921502710-kube-api-access-p7pbp\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.113648 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.113689 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d68c545-0435-4f66-a351-3ccba6fa68a3-logs\") pod \"watcher-applier-0\" (UID: 
\"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.113747 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e60dc88-91d1-4325-9832-f9a921502710-logs\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.113775 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.113795 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d68c545-0435-4f66-a351-3ccba6fa68a3-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.115452 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d68c545-0435-4f66-a351-3ccba6fa68a3-logs\") pod \"watcher-applier-0\" (UID: \"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.116896 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e60dc88-91d1-4325-9832-f9a921502710-logs\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.117196 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.117852 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.118190 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.118418 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-config-data\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.120666 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d68c545-0435-4f66-a351-3ccba6fa68a3-config-data\") pod \"watcher-applier-0\" (UID: \"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.124228 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d68c545-0435-4f66-a351-3ccba6fa68a3-config-data\") pod \"watcher-applier-0\" (UID: \"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.137371 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1d68c545-0435-4f66-a351-3ccba6fa68a3-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.137918 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lwqt\" (UniqueName: \"kubernetes.io/projected/1d68c545-0435-4f66-a351-3ccba6fa68a3-kube-api-access-6lwqt\") pod \"watcher-applier-0\" (UID: \"1d68c545-0435-4f66-a351-3ccba6fa68a3\") " pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.143208 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7pbp\" (UniqueName: \"kubernetes.io/projected/4e60dc88-91d1-4325-9832-f9a921502710-kube-api-access-p7pbp\") pod \"watcher-api-0\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") " pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.275542 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.297525 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.746898 4807 generic.go:334] "Generic (PLEG): container finished" podID="372e15f7-0eed-4b38-b4a9-19d3781e6e89" containerID="f9e09eea3225b6ab9506ff5d6c1236d0d7389d54ac92d17cffdc71838b7d48d0" exitCode=0 Dec 02 20:20:47 crc kubenswrapper[4807]: I1202 20:20:47.746966 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rp8rr" event={"ID":"372e15f7-0eed-4b38-b4a9-19d3781e6e89","Type":"ContainerDied","Data":"f9e09eea3225b6ab9506ff5d6c1236d0d7389d54ac92d17cffdc71838b7d48d0"} Dec 02 20:20:53 crc kubenswrapper[4807]: I1202 20:20:53.195631 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 20:20:53 crc kubenswrapper[4807]: I1202 20:20:53.196150 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 20:20:53 crc kubenswrapper[4807]: I1202 20:20:53.396581 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 20:20:53 crc kubenswrapper[4807]: I1202 20:20:53.396641 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 20:20:55 crc kubenswrapper[4807]: I1202 20:20:55.027490 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: i/o timeout" Dec 02 20:20:56 crc kubenswrapper[4807]: E1202 20:20:56.181944 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 02 20:20:56 crc kubenswrapper[4807]: E1202 20:20:56.182184 4807 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2zvj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-bppsk_openstack(9b83e24a-ce7d-42b1-998f-1ede619914ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:20:56 crc kubenswrapper[4807]: E1202 20:20:56.183410 4807 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-bppsk" podUID="9b83e24a-ce7d-42b1-998f-1ede619914ff" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.359006 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.366707 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.394037 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.432323 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.465352 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-config\") pod \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.465692 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00c37d10-a573-4bfb-8241-ca5c4e9b4396-horizon-secret-key\") pod \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.465772 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-config-data\") pod \"72652f32-e037-48c8-850e-fb193ad32c5a\" (UID: 
\"72652f32-e037-48c8-850e-fb193ad32c5a\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.466535 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-scripts\") pod \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.466589 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-combined-ca-bundle\") pod \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.466887 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72652f32-e037-48c8-850e-fb193ad32c5a-logs\") pod \"72652f32-e037-48c8-850e-fb193ad32c5a\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.466996 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-config-data\") pod \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.467040 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvdkk\" (UniqueName: \"kubernetes.io/projected/72652f32-e037-48c8-850e-fb193ad32c5a-kube-api-access-zvdkk\") pod \"72652f32-e037-48c8-850e-fb193ad32c5a\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.467065 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjhq9\" (UniqueName: 
\"kubernetes.io/projected/00c37d10-a573-4bfb-8241-ca5c4e9b4396-kube-api-access-xjhq9\") pod \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.467999 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-config-data" (OuterVolumeSpecName: "config-data") pod "72652f32-e037-48c8-850e-fb193ad32c5a" (UID: "72652f32-e037-48c8-850e-fb193ad32c5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.468600 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72652f32-e037-48c8-850e-fb193ad32c5a-logs" (OuterVolumeSpecName: "logs") pod "72652f32-e037-48c8-850e-fb193ad32c5a" (UID: "72652f32-e037-48c8-850e-fb193ad32c5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.469110 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-scripts" (OuterVolumeSpecName: "scripts") pod "00c37d10-a573-4bfb-8241-ca5c4e9b4396" (UID: "00c37d10-a573-4bfb-8241-ca5c4e9b4396"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.469133 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72652f32-e037-48c8-850e-fb193ad32c5a-horizon-secret-key\") pod \"72652f32-e037-48c8-850e-fb193ad32c5a\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.469218 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx4v7\" (UniqueName: \"kubernetes.io/projected/372e15f7-0eed-4b38-b4a9-19d3781e6e89-kube-api-access-jx4v7\") pod \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\" (UID: \"372e15f7-0eed-4b38-b4a9-19d3781e6e89\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.469314 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00c37d10-a573-4bfb-8241-ca5c4e9b4396-logs\") pod \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\" (UID: \"00c37d10-a573-4bfb-8241-ca5c4e9b4396\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.469359 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-scripts\") pod \"72652f32-e037-48c8-850e-fb193ad32c5a\" (UID: \"72652f32-e037-48c8-850e-fb193ad32c5a\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.469514 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-config-data" (OuterVolumeSpecName: "config-data") pod "00c37d10-a573-4bfb-8241-ca5c4e9b4396" (UID: "00c37d10-a573-4bfb-8241-ca5c4e9b4396"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.470045 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00c37d10-a573-4bfb-8241-ca5c4e9b4396-logs" (OuterVolumeSpecName: "logs") pod "00c37d10-a573-4bfb-8241-ca5c4e9b4396" (UID: "00c37d10-a573-4bfb-8241-ca5c4e9b4396"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.470500 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-scripts" (OuterVolumeSpecName: "scripts") pod "72652f32-e037-48c8-850e-fb193ad32c5a" (UID: "72652f32-e037-48c8-850e-fb193ad32c5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.470932 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.470953 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00c37d10-a573-4bfb-8241-ca5c4e9b4396-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.470966 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.470976 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72652f32-e037-48c8-850e-fb193ad32c5a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.470985 4807 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00c37d10-a573-4bfb-8241-ca5c4e9b4396-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.470993 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72652f32-e037-48c8-850e-fb193ad32c5a-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.471561 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00c37d10-a573-4bfb-8241-ca5c4e9b4396-kube-api-access-xjhq9" (OuterVolumeSpecName: "kube-api-access-xjhq9") pod "00c37d10-a573-4bfb-8241-ca5c4e9b4396" (UID: "00c37d10-a573-4bfb-8241-ca5c4e9b4396"). InnerVolumeSpecName "kube-api-access-xjhq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.472168 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.474169 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72652f32-e037-48c8-850e-fb193ad32c5a-kube-api-access-zvdkk" (OuterVolumeSpecName: "kube-api-access-zvdkk") pod "72652f32-e037-48c8-850e-fb193ad32c5a" (UID: "72652f32-e037-48c8-850e-fb193ad32c5a"). InnerVolumeSpecName "kube-api-access-zvdkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.475992 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/372e15f7-0eed-4b38-b4a9-19d3781e6e89-kube-api-access-jx4v7" (OuterVolumeSpecName: "kube-api-access-jx4v7") pod "372e15f7-0eed-4b38-b4a9-19d3781e6e89" (UID: "372e15f7-0eed-4b38-b4a9-19d3781e6e89"). InnerVolumeSpecName "kube-api-access-jx4v7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.479577 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72652f32-e037-48c8-850e-fb193ad32c5a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "72652f32-e037-48c8-850e-fb193ad32c5a" (UID: "72652f32-e037-48c8-850e-fb193ad32c5a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.480172 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00c37d10-a573-4bfb-8241-ca5c4e9b4396-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "00c37d10-a573-4bfb-8241-ca5c4e9b4396" (UID: "00c37d10-a573-4bfb-8241-ca5c4e9b4396"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.507855 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "372e15f7-0eed-4b38-b4a9-19d3781e6e89" (UID: "372e15f7-0eed-4b38-b4a9-19d3781e6e89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.545910 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-config" (OuterVolumeSpecName: "config") pod "372e15f7-0eed-4b38-b4a9-19d3781e6e89" (UID: "372e15f7-0eed-4b38-b4a9-19d3781e6e89"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.572560 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-nb\") pod \"9e632286-c5c3-48a0-a79c-445edea3b864\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.572620 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-scripts\") pod \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.572673 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-sb\") pod \"9e632286-c5c3-48a0-a79c-445edea3b864\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.572705 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-dns-svc\") pod \"9e632286-c5c3-48a0-a79c-445edea3b864\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.572799 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-config-data\") pod \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.572833 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxfcp\" (UniqueName: 
\"kubernetes.io/projected/9e632286-c5c3-48a0-a79c-445edea3b864-kube-api-access-pxfcp\") pod \"9e632286-c5c3-48a0-a79c-445edea3b864\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.572856 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-horizon-secret-key\") pod \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.572894 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-logs\") pod \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.573097 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgtth\" (UniqueName: \"kubernetes.io/projected/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-kube-api-access-fgtth\") pod \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\" (UID: \"c18627ce-9a31-4f92-b1a1-6a1c63e7c583\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.573145 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-config\") pod \"9e632286-c5c3-48a0-a79c-445edea3b864\" (UID: \"9e632286-c5c3-48a0-a79c-445edea3b864\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.573622 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.573649 4807 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/00c37d10-a573-4bfb-8241-ca5c4e9b4396-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.573660 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/372e15f7-0eed-4b38-b4a9-19d3781e6e89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.573669 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvdkk\" (UniqueName: \"kubernetes.io/projected/72652f32-e037-48c8-850e-fb193ad32c5a-kube-api-access-zvdkk\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.573678 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjhq9\" (UniqueName: \"kubernetes.io/projected/00c37d10-a573-4bfb-8241-ca5c4e9b4396-kube-api-access-xjhq9\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.573689 4807 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/72652f32-e037-48c8-850e-fb193ad32c5a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.573702 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx4v7\" (UniqueName: \"kubernetes.io/projected/372e15f7-0eed-4b38-b4a9-19d3781e6e89-kube-api-access-jx4v7\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.574262 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-logs" (OuterVolumeSpecName: "logs") pod "c18627ce-9a31-4f92-b1a1-6a1c63e7c583" (UID: "c18627ce-9a31-4f92-b1a1-6a1c63e7c583"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.575104 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-config-data" (OuterVolumeSpecName: "config-data") pod "c18627ce-9a31-4f92-b1a1-6a1c63e7c583" (UID: "c18627ce-9a31-4f92-b1a1-6a1c63e7c583"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.575209 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-scripts" (OuterVolumeSpecName: "scripts") pod "c18627ce-9a31-4f92-b1a1-6a1c63e7c583" (UID: "c18627ce-9a31-4f92-b1a1-6a1c63e7c583"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.578043 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-kube-api-access-fgtth" (OuterVolumeSpecName: "kube-api-access-fgtth") pod "c18627ce-9a31-4f92-b1a1-6a1c63e7c583" (UID: "c18627ce-9a31-4f92-b1a1-6a1c63e7c583"). InnerVolumeSpecName "kube-api-access-fgtth". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.579131 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e632286-c5c3-48a0-a79c-445edea3b864-kube-api-access-pxfcp" (OuterVolumeSpecName: "kube-api-access-pxfcp") pod "9e632286-c5c3-48a0-a79c-445edea3b864" (UID: "9e632286-c5c3-48a0-a79c-445edea3b864"). InnerVolumeSpecName "kube-api-access-pxfcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.579576 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c18627ce-9a31-4f92-b1a1-6a1c63e7c583" (UID: "c18627ce-9a31-4f92-b1a1-6a1c63e7c583"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.613831 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.620619 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.623393 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9e632286-c5c3-48a0-a79c-445edea3b864" (UID: "9e632286-c5c3-48a0-a79c-445edea3b864"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.637594 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9e632286-c5c3-48a0-a79c-445edea3b864" (UID: "9e632286-c5c3-48a0-a79c-445edea3b864"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.639680 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e632286-c5c3-48a0-a79c-445edea3b864" (UID: "9e632286-c5c3-48a0-a79c-445edea3b864"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.640088 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-config" (OuterVolumeSpecName: "config") pod "9e632286-c5c3-48a0-a79c-445edea3b864" (UID: "9e632286-c5c3-48a0-a79c-445edea3b864"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.664938 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cbfd7dcb-hzflv"] Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.674915 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-config-data\") pod \"7a24c2bd-52fb-46be-9df9-f584530efc32\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675014 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"7a24c2bd-52fb-46be-9df9-f584530efc32\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675045 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-combined-ca-bundle\") pod 
\"7a24c2bd-52fb-46be-9df9-f584530efc32\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675085 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-combined-ca-bundle\") pod \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675201 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqk2h\" (UniqueName: \"kubernetes.io/projected/7a24c2bd-52fb-46be-9df9-f584530efc32-kube-api-access-nqk2h\") pod \"7a24c2bd-52fb-46be-9df9-f584530efc32\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675244 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-scripts\") pod \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675271 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-public-tls-certs\") pod \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675320 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-httpd-run\") pod \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675345 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-scripts\") pod \"7a24c2bd-52fb-46be-9df9-f584530efc32\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675373 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-config-data\") pod \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675417 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v74l\" (UniqueName: \"kubernetes.io/projected/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-kube-api-access-7v74l\") pod \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675434 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-logs\") pod \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675482 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-logs\") pod \"7a24c2bd-52fb-46be-9df9-f584530efc32\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675507 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-httpd-run\") pod \"7a24c2bd-52fb-46be-9df9-f584530efc32\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675527 4807 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\" (UID: \"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675555 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-internal-tls-certs\") pod \"7a24c2bd-52fb-46be-9df9-f584530efc32\" (UID: \"7a24c2bd-52fb-46be-9df9-f584530efc32\") " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675932 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgtth\" (UniqueName: \"kubernetes.io/projected/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-kube-api-access-fgtth\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675948 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675957 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675967 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675978 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675987 4807 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e632286-c5c3-48a0-a79c-445edea3b864-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.675996 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.676004 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxfcp\" (UniqueName: \"kubernetes.io/projected/9e632286-c5c3-48a0-a79c-445edea3b864-kube-api-access-pxfcp\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.676014 4807 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.676022 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c18627ce-9a31-4f92-b1a1-6a1c63e7c583-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.678760 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" (UID: "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.679655 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-logs" (OuterVolumeSpecName: "logs") pod "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" (UID: "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.680300 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7a24c2bd-52fb-46be-9df9-f584530efc32" (UID: "7a24c2bd-52fb-46be-9df9-f584530efc32"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.680549 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-logs" (OuterVolumeSpecName: "logs") pod "7a24c2bd-52fb-46be-9df9-f584530efc32" (UID: "7a24c2bd-52fb-46be-9df9-f584530efc32"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.683230 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-scripts" (OuterVolumeSpecName: "scripts") pod "7a24c2bd-52fb-46be-9df9-f584530efc32" (UID: "7a24c2bd-52fb-46be-9df9-f584530efc32"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.683788 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a24c2bd-52fb-46be-9df9-f584530efc32-kube-api-access-nqk2h" (OuterVolumeSpecName: "kube-api-access-nqk2h") pod "7a24c2bd-52fb-46be-9df9-f584530efc32" (UID: "7a24c2bd-52fb-46be-9df9-f584530efc32"). InnerVolumeSpecName "kube-api-access-nqk2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.685321 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" (UID: "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.685878 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "7a24c2bd-52fb-46be-9df9-f584530efc32" (UID: "7a24c2bd-52fb-46be-9df9-f584530efc32"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.686783 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-scripts" (OuterVolumeSpecName: "scripts") pod "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" (UID: "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.700109 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-kube-api-access-7v74l" (OuterVolumeSpecName: "kube-api-access-7v74l") pod "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" (UID: "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c"). InnerVolumeSpecName "kube-api-access-7v74l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.709570 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" (UID: "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.713908 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a24c2bd-52fb-46be-9df9-f584530efc32" (UID: "7a24c2bd-52fb-46be-9df9-f584530efc32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.740602 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7a24c2bd-52fb-46be-9df9-f584530efc32" (UID: "7a24c2bd-52fb-46be-9df9-f584530efc32"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.741577 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" (UID: "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.746941 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-config-data" (OuterVolumeSpecName: "config-data") pod "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" (UID: "fae8c3c5-89bf-4852-b1f9-4423faa2fb8c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.754997 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-config-data" (OuterVolumeSpecName: "config-data") pod "7a24c2bd-52fb-46be-9df9-f584530efc32" (UID: "7a24c2bd-52fb-46be-9df9-f584530efc32"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.778781 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.782852 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.782896 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.782907 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqk2h\" (UniqueName: \"kubernetes.io/projected/7a24c2bd-52fb-46be-9df9-f584530efc32-kube-api-access-nqk2h\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.782933 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.782957 4807 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.782967 4807 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.782977 4807 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.782985 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.782994 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v74l\" (UniqueName: \"kubernetes.io/projected/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-kube-api-access-7v74l\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.783005 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.783017 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.783026 4807 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a24c2bd-52fb-46be-9df9-f584530efc32-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.783065 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.783076 4807 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-internal-tls-certs\") on node \"crc\" DevicePath 
\"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.783087 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a24c2bd-52fb-46be-9df9-f584530efc32-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.800881 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.802128 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.840597 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b4587fb9-xsql7" event={"ID":"72652f32-e037-48c8-850e-fb193ad32c5a","Type":"ContainerDied","Data":"95871a1d60e36fcabbdbab5c33ecd950341a88b52f4bc12f64c1915feb4d1a2c"} Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.840628 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75b4587fb9-xsql7" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.845586 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.845586 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fae8c3c5-89bf-4852-b1f9-4423faa2fb8c","Type":"ContainerDied","Data":"5babe5dc78c315284fabb15287b95a4beea12af146df1bc9594611b5cb769373"} Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.845647 4807 scope.go:117] "RemoveContainer" containerID="0c1915cb9f9da104e3887de5c5ee30cbccb751369c923cac807a484b0025e81c" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.848830 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.848854 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" event={"ID":"9e632286-c5c3-48a0-a79c-445edea3b864","Type":"ContainerDied","Data":"74fe31fead1f31289e2dc91fe484501301fe27c8a1cb5d5ac884a39d5eeeeed2"} Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.858982 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.858980 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a24c2bd-52fb-46be-9df9-f584530efc32","Type":"ContainerDied","Data":"7761fd0a7619808fe85a9c0322f8aaa313ba5821c1e3bab768bfcfb1450b3f55"} Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.861370 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65d7cf6fd9-tcznw" event={"ID":"00c37d10-a573-4bfb-8241-ca5c4e9b4396","Type":"ContainerDied","Data":"8df23e59fd833bd2f88be17306415a171bfb1b3b56d9fd1c0f4d532930e50ee7"} Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.861422 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65d7cf6fd9-tcznw" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.870520 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dbb67c58f-k7fxg" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.870536 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dbb67c58f-k7fxg" event={"ID":"c18627ce-9a31-4f92-b1a1-6a1c63e7c583","Type":"ContainerDied","Data":"d7f1707baebc91958388162394a061ddd8ad3556d981979b136892ca4a3bf436"} Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.873852 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rp8rr" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.873887 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rp8rr" event={"ID":"372e15f7-0eed-4b38-b4a9-19d3781e6e89","Type":"ContainerDied","Data":"6ba0c38a41f65c828651a00d7769697b4e85ca2fb3a44a56595b1d6dd6fd41a2"} Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.873961 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba0c38a41f65c828651a00d7769697b4e85ca2fb3a44a56595b1d6dd6fd41a2" Dec 02 20:20:56 crc kubenswrapper[4807]: E1202 20:20:56.876932 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-bppsk" podUID="9b83e24a-ce7d-42b1-998f-1ede619914ff" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.889700 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.889747 4807 reconciler_common.go:293] 
"Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.896638 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:20:56 crc kubenswrapper[4807]: I1202 20:20:56.906938 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.005246 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" path="/var/lib/kubelet/pods/fae8c3c5-89bf-4852-b1f9-4423faa2fb8c/volumes" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.005988 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:20:57 crc kubenswrapper[4807]: E1202 20:20:57.006394 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" containerName="glance-log" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006407 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" containerName="glance-log" Dec 02 20:20:57 crc kubenswrapper[4807]: E1202 20:20:57.006416 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a24c2bd-52fb-46be-9df9-f584530efc32" containerName="glance-log" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006423 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a24c2bd-52fb-46be-9df9-f584530efc32" containerName="glance-log" Dec 02 20:20:57 crc kubenswrapper[4807]: E1202 20:20:57.006436 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372e15f7-0eed-4b38-b4a9-19d3781e6e89" containerName="neutron-db-sync" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006442 4807 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="372e15f7-0eed-4b38-b4a9-19d3781e6e89" containerName="neutron-db-sync" Dec 02 20:20:57 crc kubenswrapper[4807]: E1202 20:20:57.006455 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" containerName="init" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006460 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" containerName="init" Dec 02 20:20:57 crc kubenswrapper[4807]: E1202 20:20:57.006471 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" containerName="dnsmasq-dns" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006477 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" containerName="dnsmasq-dns" Dec 02 20:20:57 crc kubenswrapper[4807]: E1202 20:20:57.006497 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a24c2bd-52fb-46be-9df9-f584530efc32" containerName="glance-httpd" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006502 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a24c2bd-52fb-46be-9df9-f584530efc32" containerName="glance-httpd" Dec 02 20:20:57 crc kubenswrapper[4807]: E1202 20:20:57.006516 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" containerName="glance-httpd" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006522 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" containerName="glance-httpd" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006709 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" containerName="glance-httpd" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006738 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae8c3c5-89bf-4852-b1f9-4423faa2fb8c" 
containerName="glance-log" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006745 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" containerName="dnsmasq-dns" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006755 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a24c2bd-52fb-46be-9df9-f584530efc32" containerName="glance-httpd" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006760 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a24c2bd-52fb-46be-9df9-f584530efc32" containerName="glance-log" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.006769 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="372e15f7-0eed-4b38-b4a9-19d3781e6e89" containerName="neutron-db-sync" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.008062 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.011852 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.012339 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-jtsp2" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.012517 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.012661 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.017834 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.047993 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-75b4587fb9-xsql7"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.086762 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-75b4587fb9-xsql7"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.096268 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-scripts\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.096384 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttspj\" (UniqueName: \"kubernetes.io/projected/88b698ce-a6a9-4607-8629-394e3a2d7d29-kube-api-access-ttspj\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.096442 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.096480 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.096543 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.096669 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.096751 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-config-data\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.096774 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-logs\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.120252 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dbb67c58f-k7fxg"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.132677 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7dbb67c58f-k7fxg"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.141615 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-tdkbv"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.153632 4807 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-tdkbv"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.163525 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.170288 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.190005 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.191688 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.193439 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.193444 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.200244 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-config-data\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.200277 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-logs\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.200321 4807 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-scripts\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.200347 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttspj\" (UniqueName: \"kubernetes.io/projected/88b698ce-a6a9-4607-8629-394e3a2d7d29-kube-api-access-ttspj\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.200377 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.200408 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.200437 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.200514 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.200649 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.201601 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-logs\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.204478 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.207739 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.218631 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.219196 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.223148 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttspj\" (UniqueName: \"kubernetes.io/projected/88b698ce-a6a9-4607-8629-394e3a2d7d29-kube-api-access-ttspj\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.223610 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-config-data\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.246881 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.270938 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-65d7cf6fd9-tcznw"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.281479 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.282403 4807 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-65d7cf6fd9-tcznw"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.302802 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-logs\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.302867 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.302910 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt6nq\" (UniqueName: \"kubernetes.io/projected/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-kube-api-access-jt6nq\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.302957 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.303000 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.303019 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.303041 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.303056 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.327943 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.405307 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt6nq\" (UniqueName: \"kubernetes.io/projected/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-kube-api-access-jt6nq\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.405406 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.405477 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.405505 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.405533 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " 
pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.405554 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.405618 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-logs\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.405657 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.405946 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.407007 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 
20:20:57.407255 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-logs\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.410324 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.410918 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.411357 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.411764 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.427343 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt6nq\" 
(UniqueName: \"kubernetes.io/projected/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-kube-api-access-jt6nq\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.435185 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.613395 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-h5crw"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.614997 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.642241 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-h5crw"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.660609 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.712229 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.720304 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-config\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.720375 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9zjm\" (UniqueName: \"kubernetes.io/projected/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-kube-api-access-m9zjm\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.720399 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.720472 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-svc\") pod 
\"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.720739 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.723870 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-779468644-dc9lw"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.725750 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.728769 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.728960 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xw24n" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.729043 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.729180 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.732334 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-779468644-dc9lw"] Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.823602 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: 
\"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.823703 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-ovndb-tls-certs\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.823814 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.823844 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-config\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.823874 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-httpd-config\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.823907 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " 
pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.823933 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9zjm\" (UniqueName: \"kubernetes.io/projected/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-kube-api-access-m9zjm\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.824024 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-svc\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.824093 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-combined-ca-bundle\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.824121 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtm6r\" (UniqueName: \"kubernetes.io/projected/9f1c6d96-c355-4e05-8823-0f83ad828b3d-kube-api-access-mtm6r\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.824171 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-config\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " 
pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.828942 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-svc\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.829064 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-config\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.830105 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.833446 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.836588 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.850988 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9zjm\" (UniqueName: \"kubernetes.io/projected/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-kube-api-access-m9zjm\") pod \"dnsmasq-dns-6b7b667979-h5crw\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.927863 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-httpd-config\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.927963 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-combined-ca-bundle\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.927983 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtm6r\" (UniqueName: \"kubernetes.io/projected/9f1c6d96-c355-4e05-8823-0f83ad828b3d-kube-api-access-mtm6r\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.928018 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-config\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.928060 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-ovndb-tls-certs\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.940596 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.946658 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-combined-ca-bundle\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.949538 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-httpd-config\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.950102 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-ovndb-tls-certs\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.957510 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-config\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:57 crc kubenswrapper[4807]: I1202 20:20:57.967631 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mtm6r\" (UniqueName: \"kubernetes.io/projected/9f1c6d96-c355-4e05-8823-0f83ad828b3d-kube-api-access-mtm6r\") pod \"neutron-779468644-dc9lw\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:58 crc kubenswrapper[4807]: I1202 20:20:58.052116 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-779468644-dc9lw" Dec 02 20:20:58 crc kubenswrapper[4807]: I1202 20:20:58.293062 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:20:58 crc kubenswrapper[4807]: I1202 20:20:58.293394 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:20:58 crc kubenswrapper[4807]: I1202 20:20:58.293441 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:20:58 crc kubenswrapper[4807]: I1202 20:20:58.294248 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13f662bee9488997accc2766f7577233b423e1195e39b433e12fc85d986a041b"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:20:58 crc kubenswrapper[4807]: I1202 20:20:58.294301 4807 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://13f662bee9488997accc2766f7577233b423e1195e39b433e12fc85d986a041b" gracePeriod=600 Dec 02 20:20:58 crc kubenswrapper[4807]: I1202 20:20:58.990503 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00c37d10-a573-4bfb-8241-ca5c4e9b4396" path="/var/lib/kubelet/pods/00c37d10-a573-4bfb-8241-ca5c4e9b4396/volumes" Dec 02 20:20:58 crc kubenswrapper[4807]: I1202 20:20:58.991112 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72652f32-e037-48c8-850e-fb193ad32c5a" path="/var/lib/kubelet/pods/72652f32-e037-48c8-850e-fb193ad32c5a/volumes" Dec 02 20:20:58 crc kubenswrapper[4807]: I1202 20:20:58.991645 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a24c2bd-52fb-46be-9df9-f584530efc32" path="/var/lib/kubelet/pods/7a24c2bd-52fb-46be-9df9-f584530efc32/volumes" Dec 02 20:20:58 crc kubenswrapper[4807]: I1202 20:20:58.992671 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" path="/var/lib/kubelet/pods/9e632286-c5c3-48a0-a79c-445edea3b864/volumes" Dec 02 20:20:58 crc kubenswrapper[4807]: I1202 20:20:58.994287 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18627ce-9a31-4f92-b1a1-6a1c63e7c583" path="/var/lib/kubelet/pods/c18627ce-9a31-4f92-b1a1-6a1c63e7c583/volumes" Dec 02 20:20:59 crc kubenswrapper[4807]: E1202 20:20:59.647168 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 02 20:20:59 crc kubenswrapper[4807]: E1202 20:20:59.647597 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7zfcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsU
ser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-zlf57_openstack(2b4b9175-26ae-4cff-8dd2-7682b1408271): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 20:20:59 crc kubenswrapper[4807]: E1202 20:20:59.649091 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-zlf57" podUID="2b4b9175-26ae-4cff-8dd2-7682b1408271" Dec 02 20:20:59 crc kubenswrapper[4807]: I1202 20:20:59.797759 4807 scope.go:117] "RemoveContainer" containerID="a19c47ea0b69774ecca4da191b8a01fcd53207335be53695ca15985b92d97b05" Dec 02 20:20:59 crc kubenswrapper[4807]: I1202 20:20:59.971330 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="13f662bee9488997accc2766f7577233b423e1195e39b433e12fc85d986a041b" exitCode=0 Dec 02 20:20:59 crc kubenswrapper[4807]: I1202 20:20:59.971679 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"13f662bee9488997accc2766f7577233b423e1195e39b433e12fc85d986a041b"} Dec 02 20:20:59 crc kubenswrapper[4807]: I1202 20:20:59.980553 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbfd7dcb-hzflv" event={"ID":"f5570109-9e91-473c-8a41-47081ace3591","Type":"ContainerStarted","Data":"04d0a11e1f4a159693b1aabb45ba5cd1ef50a1bffb10ed8fcd5478770aae5033"} Dec 
02 20:20:59 crc kubenswrapper[4807]: E1202 20:20:59.995964 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-zlf57" podUID="2b4b9175-26ae-4cff-8dd2-7682b1408271" Dec 02 20:21:00 crc kubenswrapper[4807]: I1202 20:21:00.031338 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-tdkbv" podUID="9e632286-c5c3-48a0-a79c-445edea3b864" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: i/o timeout" Dec 02 20:21:00 crc kubenswrapper[4807]: I1202 20:21:00.100273 4807 scope.go:117] "RemoveContainer" containerID="66603f698908783c8443c16b6ab879e122ba6a4191f63cfc19f612f4335ef295" Dec 02 20:21:00 crc kubenswrapper[4807]: I1202 20:21:00.192387 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cc8fc8c44-b8pmd"] Dec 02 20:21:00 crc kubenswrapper[4807]: I1202 20:21:00.198575 4807 scope.go:117] "RemoveContainer" containerID="745648b60cf2543ae1e6dffbb9bcdb776ae860287454a955f890a3604dca454d" Dec 02 20:21:00 crc kubenswrapper[4807]: I1202 20:21:00.336246 4807 scope.go:117] "RemoveContainer" containerID="bd26f685403ee381d454ce4d67e7852ce931899d2fa15b2adb729b2a07c06f12" Dec 02 20:21:00 crc kubenswrapper[4807]: I1202 20:21:00.695595 4807 scope.go:117] "RemoveContainer" containerID="4299539d7a41be88d46f14169b00a873041a1554e409cabc356b79f88ca75629" Dec 02 20:21:00 crc kubenswrapper[4807]: I1202 20:21:00.745496 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vm4lf"] Dec 02 20:21:00 crc kubenswrapper[4807]: I1202 20:21:00.774049 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 20:21:00 crc kubenswrapper[4807]: I1202 20:21:00.945866 4807 scope.go:117] "RemoveContainer" 
containerID="38f5416dfc921b3e8d35befa1ba790fe16c356edf6ecbb6687773608446c2497" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.039328 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.039957 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-h5crw"] Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.044969 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.061805 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fc8c44-b8pmd" event={"ID":"62551774-7dc7-4727-a79d-f92d4f82d560","Type":"ContainerStarted","Data":"ef4c3926c332bcd00dcf971b4776055b08fc6d5c025e9fcea4405cf8b54d214f"} Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.086146 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.143224 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129"} Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.164348 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vm4lf" event={"ID":"9af9ee27-367c-4051-95bd-78ede0827b19","Type":"ContainerStarted","Data":"dd181bdbe34075b663ee53f797ed59f375ede14eecdb3a0cda125b2d123863b2"} Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.183017 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fc7d360-e78a-4042-98e1-f8a2b97c10ab","Type":"ContainerStarted","Data":"68014bcc35f5cefa43ab4472ad58faa656b225d8d308e86a6303b1b2499aa487"} Dec 02 20:21:01 crc 
kubenswrapper[4807]: I1202 20:21:01.215993 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tgqtm" event={"ID":"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe","Type":"ContainerStarted","Data":"7c3bdef80b4682d971604e3dd3033e145a226709e1bb76a18169004f91d6b88e"} Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.254146 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5597979745-dn972"] Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.257355 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.263747 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5597979745-dn972"] Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.263816 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.263926 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.267637 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vm4lf" podStartSLOduration=24.267610937 podStartE2EDuration="24.267610937s" podCreationTimestamp="2025-12-02 20:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:01.217183196 +0000 UTC m=+1396.518090691" watchObservedRunningTime="2025-12-02 20:21:01.267610937 +0000 UTC m=+1396.568518432" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.290116 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-tgqtm" podStartSLOduration=8.510171548 podStartE2EDuration="40.290087332s" podCreationTimestamp="2025-12-02 20:20:21 +0000 UTC" 
firstStartedPulling="2025-12-02 20:20:24.420183572 +0000 UTC m=+1359.721091067" lastFinishedPulling="2025-12-02 20:20:56.200099336 +0000 UTC m=+1391.501006851" observedRunningTime="2025-12-02 20:21:01.254490049 +0000 UTC m=+1396.555397544" watchObservedRunningTime="2025-12-02 20:21:01.290087332 +0000 UTC m=+1396.590994817" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.351953 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j52tv\" (UniqueName: \"kubernetes.io/projected/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-kube-api-access-j52tv\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.352284 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-combined-ca-bundle\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.352407 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-public-tls-certs\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.355708 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-config\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.355930 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-ovndb-tls-certs\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.356010 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-internal-tls-certs\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.356344 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-httpd-config\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.461782 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j52tv\" (UniqueName: \"kubernetes.io/projected/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-kube-api-access-j52tv\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.461861 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-combined-ca-bundle\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.462072 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-public-tls-certs\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.462269 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-config\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.462332 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-ovndb-tls-certs\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.462384 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-internal-tls-certs\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.463769 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-httpd-config\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.477974 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-combined-ca-bundle\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.484830 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-public-tls-certs\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.485566 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-httpd-config\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.491473 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-config\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.497773 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-ovndb-tls-certs\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.498682 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-internal-tls-certs\") pod \"neutron-5597979745-dn972\" (UID: 
\"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.504928 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j52tv\" (UniqueName: \"kubernetes.io/projected/a98ea655-0ec2-4d0f-951a-57f5ee9f6df2-kube-api-access-j52tv\") pod \"neutron-5597979745-dn972\" (UID: \"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2\") " pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.589520 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:21:01 crc kubenswrapper[4807]: I1202 20:21:01.602438 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.245442 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-779468644-dc9lw"] Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.334294 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"4e60dc88-91d1-4325-9832-f9a921502710","Type":"ContainerStarted","Data":"3d3746cae99fa622248b8c25be00e5cea563131189657f26fd383333bb95fa2f"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.334385 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"4e60dc88-91d1-4325-9832-f9a921502710","Type":"ContainerStarted","Data":"9bc8ba85975bbbc26b9ff4c698a8b1d9950252e39bb81c90774aedd118d44b21"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.334395 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"4e60dc88-91d1-4325-9832-f9a921502710","Type":"ContainerStarted","Data":"a1504e7928460dd554f17357b52f2d520bd8f14faaf4412ab17c1d0d008852f9"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.339926 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/watcher-api-0" Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.352963 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": dial tcp 10.217.0.162:9322: connect: connection refused" Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.380699 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"88b698ce-a6a9-4607-8629-394e3a2d7d29","Type":"ContainerStarted","Data":"8a2e324022adecb3f865ca1d1068e038c33dd749abc8ee5073fd03b44e088596"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.394251 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.423223 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=16.423186607 podStartE2EDuration="16.423186607s" podCreationTimestamp="2025-12-02 20:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:02.398121266 +0000 UTC m=+1397.699028761" watchObservedRunningTime="2025-12-02 20:21:02.423186607 +0000 UTC m=+1397.724094102" Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.475635 4807 generic.go:334] "Generic (PLEG): container finished" podID="ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" containerID="5a07ee4176ba65fa0a6bdb417a528468f20d7f33c9a1db630d3deb5b84f7e191" exitCode=0 Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.475737 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" event={"ID":"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc","Type":"ContainerDied","Data":"5a07ee4176ba65fa0a6bdb417a528468f20d7f33c9a1db630d3deb5b84f7e191"} Dec 02 
20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.475768 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" event={"ID":"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc","Type":"ContainerStarted","Data":"407cf19254ac1fd6dab9bb853d1022efa30f6216923c9107997a86fef00f3540"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.475835 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5597979745-dn972"] Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.486785 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbfd7dcb-hzflv" event={"ID":"f5570109-9e91-473c-8a41-47081ace3591","Type":"ContainerStarted","Data":"a5b9f9df4178ab2f7a01733e7ecf42c37c8df519d598c08c11cc3ca55a100f1a"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.486835 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbfd7dcb-hzflv" event={"ID":"f5570109-9e91-473c-8a41-47081ace3591","Type":"ContainerStarted","Data":"5b7c7ae1a30a0524bb93e7662c35e42f371e68c45e661e1bcc95bc20fa1c78bb"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.526675 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5cbfd7dcb-hzflv" podStartSLOduration=29.477209783 podStartE2EDuration="30.526652946s" podCreationTimestamp="2025-12-02 20:20:32 +0000 UTC" firstStartedPulling="2025-12-02 20:20:59.699942545 +0000 UTC m=+1395.000850030" lastFinishedPulling="2025-12-02 20:21:00.749385698 +0000 UTC m=+1396.050293193" observedRunningTime="2025-12-02 20:21:02.525354067 +0000 UTC m=+1397.826261582" watchObservedRunningTime="2025-12-02 20:21:02.526652946 +0000 UTC m=+1397.827560441" Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.542286 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vm4lf" 
event={"ID":"9af9ee27-367c-4051-95bd-78ede0827b19","Type":"ContainerStarted","Data":"056b97430d9365f074a446ca366ddf09e067626066666459ccce07d819f95502"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.549691 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"95945b94-12eb-4077-b233-45c5c2b6b51d","Type":"ContainerStarted","Data":"edc02ce2c1fe47105e66528d22ad724c1e9087d8c9124f05de80f5c3573abba4"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.557832 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fc8c44-b8pmd" event={"ID":"62551774-7dc7-4727-a79d-f92d4f82d560","Type":"ContainerStarted","Data":"1a2bc25490b727e95a676dc6e738dd9535b3f9346206cb4c536bfdd80fd32d19"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.557899 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fc8c44-b8pmd" event={"ID":"62551774-7dc7-4727-a79d-f92d4f82d560","Type":"ContainerStarted","Data":"acc5eee2fa3e2e060eea697bd28253234491412bf2a4bb3353d3abe041d69fa2"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.563253 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"1d68c545-0435-4f66-a351-3ccba6fa68a3","Type":"ContainerStarted","Data":"139aee39e9156b3bed4bf3f186b850853daa85122f43c93360ba5a1bc97d5fef"} Dec 02 20:21:02 crc kubenswrapper[4807]: I1202 20:21:02.588808 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7cc8fc8c44-b8pmd" podStartSLOduration=31.070324016 podStartE2EDuration="31.58869773s" podCreationTimestamp="2025-12-02 20:20:31 +0000 UTC" firstStartedPulling="2025-12-02 20:21:00.352688751 +0000 UTC m=+1395.653596246" lastFinishedPulling="2025-12-02 20:21:00.871062465 +0000 UTC m=+1396.171969960" observedRunningTime="2025-12-02 20:21:02.579422286 +0000 UTC m=+1397.880329781" watchObservedRunningTime="2025-12-02 20:21:02.58869773 +0000 UTC 
m=+1397.889605225" Dec 02 20:21:03 crc kubenswrapper[4807]: I1202 20:21:03.577264 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"88b698ce-a6a9-4607-8629-394e3a2d7d29","Type":"ContainerStarted","Data":"5da3672237909f2b76e84812636ed8d7d8ec3f5487f44ca0397dbb4ca09ef4da"} Dec 02 20:21:03 crc kubenswrapper[4807]: I1202 20:21:03.580379 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62","Type":"ContainerStarted","Data":"6a5c6268c2ed72948f67dfc5e85bcfe50696e07d236efc57ddb6c07a6e20e3e5"} Dec 02 20:21:03 crc kubenswrapper[4807]: I1202 20:21:03.584219 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-779468644-dc9lw" event={"ID":"9f1c6d96-c355-4e05-8823-0f83ad828b3d","Type":"ContainerStarted","Data":"09f9a52eda291674f8c6be27d86f7266ff9ac29b17474f0a591702b419e1e6c4"} Dec 02 20:21:03 crc kubenswrapper[4807]: I1202 20:21:03.584572 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-779468644-dc9lw" event={"ID":"9f1c6d96-c355-4e05-8823-0f83ad828b3d","Type":"ContainerStarted","Data":"f0b739c61fcefa1306e3cea6c38def130142d0b973363fe4b66f1422a3d7f635"} Dec 02 20:21:03 crc kubenswrapper[4807]: I1202 20:21:03.589568 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5597979745-dn972" event={"ID":"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2","Type":"ContainerStarted","Data":"c79ca85fcbfd10cde6e1fc97fc76c364c6fc60e5f3b565519c44d3c1fcde8e3e"} Dec 02 20:21:05 crc kubenswrapper[4807]: I1202 20:21:05.643737 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" event={"ID":"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc","Type":"ContainerStarted","Data":"00aa61c4b03d62296708ec8fcf10db66f94b54f6c5c75bd017f7ad7fc374abec"} Dec 02 20:21:05 crc kubenswrapper[4807]: I1202 20:21:05.646273 4807 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-5597979745-dn972" event={"ID":"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2","Type":"ContainerStarted","Data":"f84effe27a2094800635a55a7cc63d73d36d101a5384cccade63d0fd11d6b888"} Dec 02 20:21:05 crc kubenswrapper[4807]: I1202 20:21:05.867580 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 02 20:21:06 crc kubenswrapper[4807]: I1202 20:21:06.658906 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62","Type":"ContainerStarted","Data":"af444d5027b97215d0398e19787af2420a5406dd8f3f33a00bd64ff2801c5918"} Dec 02 20:21:06 crc kubenswrapper[4807]: I1202 20:21:06.659341 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:21:07 crc kubenswrapper[4807]: I1202 20:21:07.276563 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Dec 02 20:21:07 crc kubenswrapper[4807]: I1202 20:21:07.276915 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 02 20:21:07 crc kubenswrapper[4807]: I1202 20:21:07.289256 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Dec 02 20:21:07 crc kubenswrapper[4807]: I1202 20:21:07.317520 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" podStartSLOduration=10.317498557 podStartE2EDuration="10.317498557s" podCreationTimestamp="2025-12-02 20:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:06.680973772 +0000 UTC m=+1401.981881287" watchObservedRunningTime="2025-12-02 20:21:07.317498557 +0000 UTC m=+1402.618406052" Dec 02 20:21:07 crc kubenswrapper[4807]: I1202 
20:21:07.692766 4807 generic.go:334] "Generic (PLEG): container finished" podID="9af9ee27-367c-4051-95bd-78ede0827b19" containerID="056b97430d9365f074a446ca366ddf09e067626066666459ccce07d819f95502" exitCode=0 Dec 02 20:21:07 crc kubenswrapper[4807]: I1202 20:21:07.692878 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vm4lf" event={"ID":"9af9ee27-367c-4051-95bd-78ede0827b19","Type":"ContainerDied","Data":"056b97430d9365f074a446ca366ddf09e067626066666459ccce07d819f95502"} Dec 02 20:21:07 crc kubenswrapper[4807]: I1202 20:21:07.697103 4807 generic.go:334] "Generic (PLEG): container finished" podID="7183d5bc-237d-4fd9-8d9c-31ccc6c46afe" containerID="7c3bdef80b4682d971604e3dd3033e145a226709e1bb76a18169004f91d6b88e" exitCode=0 Dec 02 20:21:07 crc kubenswrapper[4807]: I1202 20:21:07.697987 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tgqtm" event={"ID":"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe","Type":"ContainerDied","Data":"7c3bdef80b4682d971604e3dd3033e145a226709e1bb76a18169004f91d6b88e"} Dec 02 20:21:07 crc kubenswrapper[4807]: I1202 20:21:07.704802 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 02 20:21:07 crc kubenswrapper[4807]: I1202 20:21:07.797440 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.730527 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"88b698ce-a6a9-4607-8629-394e3a2d7d29","Type":"ContainerStarted","Data":"5923f4a532f99d9911076926bb80bbf7974024d3d16e2323a9861629b0725eac"} Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.741667 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62","Type":"ContainerStarted","Data":"a1234d2f9bbea41e51f17c78fe72a9d053c485e8d7dd5e87d80b7a256d014711"} Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.748675 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"1d68c545-0435-4f66-a351-3ccba6fa68a3","Type":"ContainerStarted","Data":"f650ae7313ef1f716ee157f0ea5ad9055a7c2a663f080a89ace828559481b192"} Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.761983 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-779468644-dc9lw" event={"ID":"9f1c6d96-c355-4e05-8823-0f83ad828b3d","Type":"ContainerStarted","Data":"385a45443efe28feba739e07293b4f8074e2dff09ff876a0db4ad5b92e9c4db5"} Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.762187 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-779468644-dc9lw" Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.770974 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.770947183 podStartE2EDuration="12.770947183s" podCreationTimestamp="2025-12-02 20:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:08.753698963 +0000 UTC m=+1404.054606448" watchObservedRunningTime="2025-12-02 20:21:08.770947183 +0000 UTC m=+1404.071854678" Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.771136 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5597979745-dn972" event={"ID":"a98ea655-0ec2-4d0f-951a-57f5ee9f6df2","Type":"ContainerStarted","Data":"45acc6565d6f13e113f323eb04076dd7c86a653840e2a56b88f19a10653a7991"} Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.772192 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5597979745-dn972" Dec 
02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.778075 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fc7d360-e78a-4042-98e1-f8a2b97c10ab","Type":"ContainerStarted","Data":"151a5b20e12ef473a2ac64a0c900d10c72e66ef7e2f8c1288e3e6b8a6e8bdf9f"} Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.780840 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"95945b94-12eb-4077-b233-45c5c2b6b51d","Type":"ContainerStarted","Data":"c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f"} Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.787921 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.787899694 podStartE2EDuration="11.787899694s" podCreationTimestamp="2025-12-02 20:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:08.785102441 +0000 UTC m=+1404.086009956" watchObservedRunningTime="2025-12-02 20:21:08.787899694 +0000 UTC m=+1404.088807189" Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.824810 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=16.623048525 podStartE2EDuration="22.824783654s" podCreationTimestamp="2025-12-02 20:20:46 +0000 UTC" firstStartedPulling="2025-12-02 20:21:01.061287058 +0000 UTC m=+1396.362194553" lastFinishedPulling="2025-12-02 20:21:07.263022187 +0000 UTC m=+1402.563929682" observedRunningTime="2025-12-02 20:21:08.812574304 +0000 UTC m=+1404.113481799" watchObservedRunningTime="2025-12-02 20:21:08.824783654 +0000 UTC m=+1404.125691149" Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.835703 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-779468644-dc9lw" 
podStartSLOduration=11.835676037 podStartE2EDuration="11.835676037s" podCreationTimestamp="2025-12-02 20:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:08.833944685 +0000 UTC m=+1404.134852180" watchObservedRunningTime="2025-12-02 20:21:08.835676037 +0000 UTC m=+1404.136583532" Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.860137 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=16.658491244 podStartE2EDuration="22.860105359s" podCreationTimestamp="2025-12-02 20:20:46 +0000 UTC" firstStartedPulling="2025-12-02 20:21:01.061582927 +0000 UTC m=+1396.362490422" lastFinishedPulling="2025-12-02 20:21:07.263197042 +0000 UTC m=+1402.564104537" observedRunningTime="2025-12-02 20:21:08.859177231 +0000 UTC m=+1404.160084726" watchObservedRunningTime="2025-12-02 20:21:08.860105359 +0000 UTC m=+1404.161012854" Dec 02 20:21:08 crc kubenswrapper[4807]: I1202 20:21:08.883592 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5597979745-dn972" podStartSLOduration=7.883568932 podStartE2EDuration="7.883568932s" podCreationTimestamp="2025-12-02 20:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:08.882564433 +0000 UTC m=+1404.183471928" watchObservedRunningTime="2025-12-02 20:21:08.883568932 +0000 UTC m=+1404.184476427" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.349187 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.473880 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-tgqtm" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.521500 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-credential-keys\") pod \"9af9ee27-367c-4051-95bd-78ede0827b19\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.521679 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-combined-ca-bundle\") pod \"9af9ee27-367c-4051-95bd-78ede0827b19\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.521777 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-config-data\") pod \"9af9ee27-367c-4051-95bd-78ede0827b19\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.521876 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc89d\" (UniqueName: \"kubernetes.io/projected/9af9ee27-367c-4051-95bd-78ede0827b19-kube-api-access-bc89d\") pod \"9af9ee27-367c-4051-95bd-78ede0827b19\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.523299 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-scripts\") pod \"9af9ee27-367c-4051-95bd-78ede0827b19\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.523370 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-fernet-keys\") pod \"9af9ee27-367c-4051-95bd-78ede0827b19\" (UID: \"9af9ee27-367c-4051-95bd-78ede0827b19\") " Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.533987 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af9ee27-367c-4051-95bd-78ede0827b19-kube-api-access-bc89d" (OuterVolumeSpecName: "kube-api-access-bc89d") pod "9af9ee27-367c-4051-95bd-78ede0827b19" (UID: "9af9ee27-367c-4051-95bd-78ede0827b19"). InnerVolumeSpecName "kube-api-access-bc89d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.535048 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9af9ee27-367c-4051-95bd-78ede0827b19" (UID: "9af9ee27-367c-4051-95bd-78ede0827b19"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.535181 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9af9ee27-367c-4051-95bd-78ede0827b19" (UID: "9af9ee27-367c-4051-95bd-78ede0827b19"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.550898 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-scripts" (OuterVolumeSpecName: "scripts") pod "9af9ee27-367c-4051-95bd-78ede0827b19" (UID: "9af9ee27-367c-4051-95bd-78ede0827b19"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.582098 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-config-data" (OuterVolumeSpecName: "config-data") pod "9af9ee27-367c-4051-95bd-78ede0827b19" (UID: "9af9ee27-367c-4051-95bd-78ede0827b19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.594580 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9af9ee27-367c-4051-95bd-78ede0827b19" (UID: "9af9ee27-367c-4051-95bd-78ede0827b19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.627659 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-config-data\") pod \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.627738 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-scripts\") pod \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.627799 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-logs\") pod \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 
20:21:09.627843 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bv64\" (UniqueName: \"kubernetes.io/projected/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-kube-api-access-5bv64\") pod \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.627887 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-combined-ca-bundle\") pod \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\" (UID: \"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe\") " Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.628437 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc89d\" (UniqueName: \"kubernetes.io/projected/9af9ee27-367c-4051-95bd-78ede0827b19-kube-api-access-bc89d\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.628458 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.628470 4807 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.628481 4807 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.628492 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.628502 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9af9ee27-367c-4051-95bd-78ede0827b19-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.628889 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-logs" (OuterVolumeSpecName: "logs") pod "7183d5bc-237d-4fd9-8d9c-31ccc6c46afe" (UID: "7183d5bc-237d-4fd9-8d9c-31ccc6c46afe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.632031 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-scripts" (OuterVolumeSpecName: "scripts") pod "7183d5bc-237d-4fd9-8d9c-31ccc6c46afe" (UID: "7183d5bc-237d-4fd9-8d9c-31ccc6c46afe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.632833 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-kube-api-access-5bv64" (OuterVolumeSpecName: "kube-api-access-5bv64") pod "7183d5bc-237d-4fd9-8d9c-31ccc6c46afe" (UID: "7183d5bc-237d-4fd9-8d9c-31ccc6c46afe"). InnerVolumeSpecName "kube-api-access-5bv64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.655949 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-config-data" (OuterVolumeSpecName: "config-data") pod "7183d5bc-237d-4fd9-8d9c-31ccc6c46afe" (UID: "7183d5bc-237d-4fd9-8d9c-31ccc6c46afe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.670193 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7183d5bc-237d-4fd9-8d9c-31ccc6c46afe" (UID: "7183d5bc-237d-4fd9-8d9c-31ccc6c46afe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.729830 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.730050 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.730139 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.730201 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bv64\" (UniqueName: \"kubernetes.io/projected/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-kube-api-access-5bv64\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.730258 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.805602 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vm4lf" 
event={"ID":"9af9ee27-367c-4051-95bd-78ede0827b19","Type":"ContainerDied","Data":"dd181bdbe34075b663ee53f797ed59f375ede14eecdb3a0cda125b2d123863b2"} Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.805861 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd181bdbe34075b663ee53f797ed59f375ede14eecdb3a0cda125b2d123863b2" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.805926 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vm4lf" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.818833 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-tgqtm" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.821305 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-tgqtm" event={"ID":"7183d5bc-237d-4fd9-8d9c-31ccc6c46afe","Type":"ContainerDied","Data":"8ab2770ddd8a6d4b5e272d8fc37544e9782576f8be00e8997887ac7de3247cf9"} Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.821354 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ab2770ddd8a6d4b5e272d8fc37544e9782576f8be00e8997887ac7de3247cf9" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.862508 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6d4cd98589-bnbn5"] Dec 02 20:21:09 crc kubenswrapper[4807]: E1202 20:21:09.863034 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af9ee27-367c-4051-95bd-78ede0827b19" containerName="keystone-bootstrap" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.863052 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af9ee27-367c-4051-95bd-78ede0827b19" containerName="keystone-bootstrap" Dec 02 20:21:09 crc kubenswrapper[4807]: E1202 20:21:09.863063 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7183d5bc-237d-4fd9-8d9c-31ccc6c46afe" 
containerName="placement-db-sync" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.863088 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7183d5bc-237d-4fd9-8d9c-31ccc6c46afe" containerName="placement-db-sync" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.863304 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af9ee27-367c-4051-95bd-78ede0827b19" containerName="keystone-bootstrap" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.863337 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7183d5bc-237d-4fd9-8d9c-31ccc6c46afe" containerName="placement-db-sync" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.864623 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.876179 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-psn74" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.876593 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.876748 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.876942 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.876962 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.881750 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 20:21:09 crc kubenswrapper[4807]: I1202 20:21:09.891629 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d4cd98589-bnbn5"] Dec 02 20:21:10 crc 
kubenswrapper[4807]: I1202 20:21:10.036489 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-844d66f984-gvswh"] Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.047394 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jft2t\" (UniqueName: \"kubernetes.io/projected/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-kube-api-access-jft2t\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.047561 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-config-data\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.047784 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-credential-keys\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.047851 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-internal-tls-certs\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.047903 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-combined-ca-bundle\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.047967 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-scripts\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.048042 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-public-tls-certs\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.048112 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-fernet-keys\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.065867 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.070986 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.071105 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.073080 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.073350 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.079466 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mh454" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.122319 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-844d66f984-gvswh"] Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.152876 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-public-tls-certs\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.153108 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2gqm\" (UniqueName: \"kubernetes.io/projected/cd577cf0-d4de-4a57-9254-8a7bf61aa686-kube-api-access-g2gqm\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.153206 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-internal-tls-certs\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.153332 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-fernet-keys\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.153564 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-config-data\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.153804 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jft2t\" (UniqueName: \"kubernetes.io/projected/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-kube-api-access-jft2t\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.153926 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-config-data\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.154038 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-public-tls-certs\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.154302 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-credential-keys\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.154368 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd577cf0-d4de-4a57-9254-8a7bf61aa686-logs\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.154396 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-internal-tls-certs\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.154435 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-combined-ca-bundle\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.154458 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-combined-ca-bundle\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.157195 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-scripts\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.157356 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-scripts\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.159648 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-internal-tls-certs\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.164228 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-public-tls-certs\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.180847 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jft2t\" (UniqueName: \"kubernetes.io/projected/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-kube-api-access-jft2t\") pod 
\"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.197936 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-scripts\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.198841 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-combined-ca-bundle\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.202467 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-credential-keys\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.202542 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-fernet-keys\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.204597 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcde4ab3-e62a-40bc-86b7-6d1c5e1af116-config-data\") pod \"keystone-6d4cd98589-bnbn5\" (UID: \"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116\") " pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:10 crc 
kubenswrapper[4807]: I1202 20:21:10.259637 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2gqm\" (UniqueName: \"kubernetes.io/projected/cd577cf0-d4de-4a57-9254-8a7bf61aa686-kube-api-access-g2gqm\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.259687 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-internal-tls-certs\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.259762 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-config-data\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.259825 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-public-tls-certs\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.259872 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd577cf0-d4de-4a57-9254-8a7bf61aa686-logs\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.259896 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-combined-ca-bundle\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.259929 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-scripts\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.260889 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd577cf0-d4de-4a57-9254-8a7bf61aa686-logs\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.271762 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-combined-ca-bundle\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.274037 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-public-tls-certs\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.276591 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-scripts\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.277429 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-internal-tls-certs\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.290309 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd577cf0-d4de-4a57-9254-8a7bf61aa686-config-data\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.305217 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2gqm\" (UniqueName: \"kubernetes.io/projected/cd577cf0-d4de-4a57-9254-8a7bf61aa686-kube-api-access-g2gqm\") pod \"placement-844d66f984-gvswh\" (UID: \"cd577cf0-d4de-4a57-9254-8a7bf61aa686\") " pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.454661 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:10 crc kubenswrapper[4807]: I1202 20:21:10.490339 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:11 crc kubenswrapper[4807]: I1202 20:21:11.126237 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-844d66f984-gvswh"] Dec 02 20:21:11 crc kubenswrapper[4807]: I1202 20:21:11.224068 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d4cd98589-bnbn5"] Dec 02 20:21:11 crc kubenswrapper[4807]: I1202 20:21:11.858282 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-844d66f984-gvswh" event={"ID":"cd577cf0-d4de-4a57-9254-8a7bf61aa686","Type":"ContainerStarted","Data":"1b49c9ca5227ac81a163d91d9c7e37ff4ba6ab59a633a157bd25d270965c8c2b"} Dec 02 20:21:11 crc kubenswrapper[4807]: I1202 20:21:11.863543 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d4cd98589-bnbn5" event={"ID":"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116","Type":"ContainerStarted","Data":"227773a2f00ec32c7604c173f0d80b51a8c0885c2192bca05cc09e0a9e8c6fc7"} Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.105548 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.105917 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api-log" containerID="cri-o://9bc8ba85975bbbc26b9ff4c698a8b1d9950252e39bb81c90774aedd118d44b21" gracePeriod=30 Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.106151 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api" containerID="cri-o://3d3746cae99fa622248b8c25be00e5cea563131189657f26fd383333bb95fa2f" gracePeriod=30 Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.289686 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-7cc8fc8c44-b8pmd" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.291569 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7cc8fc8c44-b8pmd" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.297792 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.302242 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7cc8fc8c44-b8pmd" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.158:8443: connect: connection refused" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.439543 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cbfd7dcb-hzflv" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.439592 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5cbfd7dcb-hzflv" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.443389 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5cbfd7dcb-hzflv" podUID="f5570109-9e91-473c-8a41-47081ace3591" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.879261 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bppsk" event={"ID":"9b83e24a-ce7d-42b1-998f-1ede619914ff","Type":"ContainerStarted","Data":"1c15e13888a808881cccbd1943409e7a45ae2319eb7b3fd3c184ddcad2f2a202"} Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.897943 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-844d66f984-gvswh" 
event={"ID":"cd577cf0-d4de-4a57-9254-8a7bf61aa686","Type":"ContainerStarted","Data":"8125b4db30297e14e89afdc01e62faa9ad584a8fbd72a590149c0dd72e23efdd"} Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.898004 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-844d66f984-gvswh" event={"ID":"cd577cf0-d4de-4a57-9254-8a7bf61aa686","Type":"ContainerStarted","Data":"82a473928393188a891b21db28127a5c8ffc8842316ac5fc36630aa5dcaafeae"} Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.898852 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.898878 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.906702 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-bppsk" podStartSLOduration=4.231545077 podStartE2EDuration="50.90668218s" podCreationTimestamp="2025-12-02 20:20:22 +0000 UTC" firstStartedPulling="2025-12-02 20:20:24.803050408 +0000 UTC m=+1360.103957903" lastFinishedPulling="2025-12-02 20:21:11.478187511 +0000 UTC m=+1406.779095006" observedRunningTime="2025-12-02 20:21:12.900984661 +0000 UTC m=+1408.201892156" watchObservedRunningTime="2025-12-02 20:21:12.90668218 +0000 UTC m=+1408.207589675" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.909112 4807 generic.go:334] "Generic (PLEG): container finished" podID="4e60dc88-91d1-4325-9832-f9a921502710" containerID="9bc8ba85975bbbc26b9ff4c698a8b1d9950252e39bb81c90774aedd118d44b21" exitCode=143 Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.909186 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"4e60dc88-91d1-4325-9832-f9a921502710","Type":"ContainerDied","Data":"9bc8ba85975bbbc26b9ff4c698a8b1d9950252e39bb81c90774aedd118d44b21"} Dec 02 
20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.921030 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d4cd98589-bnbn5" event={"ID":"bcde4ab3-e62a-40bc-86b7-6d1c5e1af116","Type":"ContainerStarted","Data":"fc8251e55037d87e13ab654ac7c9e7e21221ce027d64b97a11ed91436517af2a"} Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.921079 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.943675 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-844d66f984-gvswh" podStartSLOduration=3.943641802 podStartE2EDuration="3.943641802s" podCreationTimestamp="2025-12-02 20:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:12.929604047 +0000 UTC m=+1408.230511552" watchObservedRunningTime="2025-12-02 20:21:12.943641802 +0000 UTC m=+1408.244549297" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.950466 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:21:12 crc kubenswrapper[4807]: I1202 20:21:12.986577 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6d4cd98589-bnbn5" podStartSLOduration=3.98654845 podStartE2EDuration="3.98654845s" podCreationTimestamp="2025-12-02 20:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:12.966318592 +0000 UTC m=+1408.267226097" watchObservedRunningTime="2025-12-02 20:21:12.98654845 +0000 UTC m=+1408.287455945" Dec 02 20:21:13 crc kubenswrapper[4807]: I1202 20:21:13.123439 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ln9hs"] Dec 02 20:21:13 crc kubenswrapper[4807]: 
I1202 20:21:13.123853 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" podUID="51d28406-b887-4c62-8b1c-f7005f6ee3c0" containerName="dnsmasq-dns" containerID="cri-o://3b720548325b56bbd73e3a091b36150cb610105cc5099286cdec944dcb1bb854" gracePeriod=10
Dec 02 20:21:13 crc kubenswrapper[4807]: I1202 20:21:13.320544 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" podUID="51d28406-b887-4c62-8b1c-f7005f6ee3c0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: connect: connection refused"
Dec 02 20:21:13 crc kubenswrapper[4807]: I1202 20:21:13.939782 4807 generic.go:334] "Generic (PLEG): container finished" podID="51d28406-b887-4c62-8b1c-f7005f6ee3c0" containerID="3b720548325b56bbd73e3a091b36150cb610105cc5099286cdec944dcb1bb854" exitCode=0
Dec 02 20:21:13 crc kubenswrapper[4807]: I1202 20:21:13.940776 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" event={"ID":"51d28406-b887-4c62-8b1c-f7005f6ee3c0","Type":"ContainerDied","Data":"3b720548325b56bbd73e3a091b36150cb610105cc5099286cdec944dcb1bb854"}
Dec 02 20:21:15 crc kubenswrapper[4807]: I1202 20:21:15.533015 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": read tcp 10.217.0.2:57190->10.217.0.162:9322: read: connection reset by peer"
Dec 02 20:21:15 crc kubenswrapper[4807]: I1202 20:21:15.533052 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": read tcp 10.217.0.2:57204->10.217.0.162:9322: read: connection reset by peer"
Dec 02 20:21:15 crc kubenswrapper[4807]: I1202 20:21:15.975307 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs" event={"ID":"51d28406-b887-4c62-8b1c-f7005f6ee3c0","Type":"ContainerDied","Data":"8dd4f1dde0f276d96c434d2a450f7df08b9f6f566f9379d3ecfd69559e60c985"}
Dec 02 20:21:15 crc kubenswrapper[4807]: I1202 20:21:15.975788 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dd4f1dde0f276d96c434d2a450f7df08b9f6f566f9379d3ecfd69559e60c985"
Dec 02 20:21:15 crc kubenswrapper[4807]: I1202 20:21:15.978142 4807 generic.go:334] "Generic (PLEG): container finished" podID="4e60dc88-91d1-4325-9832-f9a921502710" containerID="3d3746cae99fa622248b8c25be00e5cea563131189657f26fd383333bb95fa2f" exitCode=0
Dec 02 20:21:15 crc kubenswrapper[4807]: I1202 20:21:15.978179 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"4e60dc88-91d1-4325-9832-f9a921502710","Type":"ContainerDied","Data":"3d3746cae99fa622248b8c25be00e5cea563131189657f26fd383333bb95fa2f"}
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.013825 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs"
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.077354 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-nb\") pod \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") "
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.077414 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-config\") pod \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") "
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.077587 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-svc\") pod \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") "
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.077706 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz8pc\" (UniqueName: \"kubernetes.io/projected/51d28406-b887-4c62-8b1c-f7005f6ee3c0-kube-api-access-hz8pc\") pod \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") "
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.077750 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-swift-storage-0\") pod \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") "
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.077796 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-sb\") pod \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\" (UID: \"51d28406-b887-4c62-8b1c-f7005f6ee3c0\") "
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.114087 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d28406-b887-4c62-8b1c-f7005f6ee3c0-kube-api-access-hz8pc" (OuterVolumeSpecName: "kube-api-access-hz8pc") pod "51d28406-b887-4c62-8b1c-f7005f6ee3c0" (UID: "51d28406-b887-4c62-8b1c-f7005f6ee3c0"). InnerVolumeSpecName "kube-api-access-hz8pc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.153783 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "51d28406-b887-4c62-8b1c-f7005f6ee3c0" (UID: "51d28406-b887-4c62-8b1c-f7005f6ee3c0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.176856 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-config" (OuterVolumeSpecName: "config") pod "51d28406-b887-4c62-8b1c-f7005f6ee3c0" (UID: "51d28406-b887-4c62-8b1c-f7005f6ee3c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.180276 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-config\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.180378 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz8pc\" (UniqueName: \"kubernetes.io/projected/51d28406-b887-4c62-8b1c-f7005f6ee3c0-kube-api-access-hz8pc\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.180442 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.190355 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51d28406-b887-4c62-8b1c-f7005f6ee3c0" (UID: "51d28406-b887-4c62-8b1c-f7005f6ee3c0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.203079 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "51d28406-b887-4c62-8b1c-f7005f6ee3c0" (UID: "51d28406-b887-4c62-8b1c-f7005f6ee3c0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.235314 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "51d28406-b887-4c62-8b1c-f7005f6ee3c0" (UID: "51d28406-b887-4c62-8b1c-f7005f6ee3c0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.282669 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.282700 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.282745 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51d28406-b887-4c62-8b1c-f7005f6ee3c0-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.990636 4807 generic.go:334] "Generic (PLEG): container finished" podID="9b83e24a-ce7d-42b1-998f-1ede619914ff" containerID="1c15e13888a808881cccbd1943409e7a45ae2319eb7b3fd3c184ddcad2f2a202" exitCode=0
Dec 02 20:21:16 crc kubenswrapper[4807]: I1202 20:21:16.991824 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-ln9hs"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.004272 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bppsk" event={"ID":"9b83e24a-ce7d-42b1-998f-1ede619914ff","Type":"ContainerDied","Data":"1c15e13888a808881cccbd1943409e7a45ae2319eb7b3fd3c184ddcad2f2a202"}
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.108885 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ln9hs"]
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.118181 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.118583 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.118614 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ln9hs"]
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.168102 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.305125 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.331083 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.331135 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.335660 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.378435 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.403879 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.661427 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.661529 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.705222 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 02 20:21:17 crc kubenswrapper[4807]: I1202 20:21:17.707459 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 02 20:21:18 crc kubenswrapper[4807]: I1202 20:21:18.002387 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 02 20:21:18 crc kubenswrapper[4807]: I1202 20:21:18.002578 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 02 20:21:18 crc kubenswrapper[4807]: I1202 20:21:18.003025 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 02 20:21:18 crc kubenswrapper[4807]: I1202 20:21:18.003083 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 02 20:21:18 crc kubenswrapper[4807]: I1202 20:21:18.036113 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Dec 02 20:21:18 crc kubenswrapper[4807]: I1202 20:21:18.074983 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.040919 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d28406-b887-4c62-8b1c-f7005f6ee3c0" path="/var/lib/kubelet/pods/51d28406-b887-4c62-8b1c-f7005f6ee3c0/volumes"
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.343676 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.396336 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bppsk"
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.455307 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-custom-prometheus-ca\") pod \"4e60dc88-91d1-4325-9832-f9a921502710\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") "
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.455611 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7pbp\" (UniqueName: \"kubernetes.io/projected/4e60dc88-91d1-4325-9832-f9a921502710-kube-api-access-p7pbp\") pod \"4e60dc88-91d1-4325-9832-f9a921502710\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") "
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.455832 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e60dc88-91d1-4325-9832-f9a921502710-logs\") pod \"4e60dc88-91d1-4325-9832-f9a921502710\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") "
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.455898 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-combined-ca-bundle\") pod \"4e60dc88-91d1-4325-9832-f9a921502710\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") "
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.456025 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-config-data\") pod \"4e60dc88-91d1-4325-9832-f9a921502710\" (UID: \"4e60dc88-91d1-4325-9832-f9a921502710\") "
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.463667 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e60dc88-91d1-4325-9832-f9a921502710-logs" (OuterVolumeSpecName: "logs") pod "4e60dc88-91d1-4325-9832-f9a921502710" (UID: "4e60dc88-91d1-4325-9832-f9a921502710"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.478579 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e60dc88-91d1-4325-9832-f9a921502710-kube-api-access-p7pbp" (OuterVolumeSpecName: "kube-api-access-p7pbp") pod "4e60dc88-91d1-4325-9832-f9a921502710" (UID: "4e60dc88-91d1-4325-9832-f9a921502710"). InnerVolumeSpecName "kube-api-access-p7pbp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.519645 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e60dc88-91d1-4325-9832-f9a921502710" (UID: "4e60dc88-91d1-4325-9832-f9a921502710"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.527174 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "4e60dc88-91d1-4325-9832-f9a921502710" (UID: "4e60dc88-91d1-4325-9832-f9a921502710"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.550928 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-config-data" (OuterVolumeSpecName: "config-data") pod "4e60dc88-91d1-4325-9832-f9a921502710" (UID: "4e60dc88-91d1-4325-9832-f9a921502710"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.558276 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-combined-ca-bundle\") pod \"9b83e24a-ce7d-42b1-998f-1ede619914ff\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") "
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.558479 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zvj5\" (UniqueName: \"kubernetes.io/projected/9b83e24a-ce7d-42b1-998f-1ede619914ff-kube-api-access-2zvj5\") pod \"9b83e24a-ce7d-42b1-998f-1ede619914ff\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") "
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.558567 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-db-sync-config-data\") pod \"9b83e24a-ce7d-42b1-998f-1ede619914ff\" (UID: \"9b83e24a-ce7d-42b1-998f-1ede619914ff\") "
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.560141 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7pbp\" (UniqueName: \"kubernetes.io/projected/4e60dc88-91d1-4325-9832-f9a921502710-kube-api-access-p7pbp\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.560168 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e60dc88-91d1-4325-9832-f9a921502710-logs\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.560180 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.560190 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.560202 4807 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4e60dc88-91d1-4325-9832-f9a921502710-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.564969 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b83e24a-ce7d-42b1-998f-1ede619914ff-kube-api-access-2zvj5" (OuterVolumeSpecName: "kube-api-access-2zvj5") pod "9b83e24a-ce7d-42b1-998f-1ede619914ff" (UID: "9b83e24a-ce7d-42b1-998f-1ede619914ff"). InnerVolumeSpecName "kube-api-access-2zvj5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.565680 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9b83e24a-ce7d-42b1-998f-1ede619914ff" (UID: "9b83e24a-ce7d-42b1-998f-1ede619914ff"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.588989 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b83e24a-ce7d-42b1-998f-1ede619914ff" (UID: "9b83e24a-ce7d-42b1-998f-1ede619914ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.661813 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zvj5\" (UniqueName: \"kubernetes.io/projected/9b83e24a-ce7d-42b1-998f-1ede619914ff-kube-api-access-2zvj5\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.661851 4807 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:19 crc kubenswrapper[4807]: I1202 20:21:19.661862 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b83e24a-ce7d-42b1-998f-1ede619914ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.054035 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"4e60dc88-91d1-4325-9832-f9a921502710","Type":"ContainerDied","Data":"a1504e7928460dd554f17357b52f2d520bd8f14faaf4412ab17c1d0d008852f9"}
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.054396 4807 scope.go:117] "RemoveContainer" containerID="3d3746cae99fa622248b8c25be00e5cea563131189657f26fd383333bb95fa2f"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.054048 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.059899 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bppsk"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.059885 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bppsk" event={"ID":"9b83e24a-ce7d-42b1-998f-1ede619914ff","Type":"ContainerDied","Data":"506364bcd666e1786373d81dad748d5aab6c417cc27a956bb78b6143b7b0ffe3"}
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.060318 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="506364bcd666e1786373d81dad748d5aab6c417cc27a956bb78b6143b7b0ffe3"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.068508 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.068540 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.068785 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fc7d360-e78a-4042-98e1-f8a2b97c10ab","Type":"ContainerStarted","Data":"1a9e5779281cfa8d583cecb0461d40d778409de11d8e92227f9aa90367484c28"}
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.068954 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.068971 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.141136 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.153145 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"]
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.172383 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Dec 02 20:21:20 crc kubenswrapper[4807]: E1202 20:21:20.172966 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api-log"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.172990 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api-log"
Dec 02 20:21:20 crc kubenswrapper[4807]: E1202 20:21:20.173026 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b83e24a-ce7d-42b1-998f-1ede619914ff" containerName="barbican-db-sync"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.173033 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b83e24a-ce7d-42b1-998f-1ede619914ff" containerName="barbican-db-sync"
Dec 02 20:21:20 crc kubenswrapper[4807]: E1202 20:21:20.173049 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d28406-b887-4c62-8b1c-f7005f6ee3c0" containerName="init"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.173056 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d28406-b887-4c62-8b1c-f7005f6ee3c0" containerName="init"
Dec 02 20:21:20 crc kubenswrapper[4807]: E1202 20:21:20.173070 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d28406-b887-4c62-8b1c-f7005f6ee3c0" containerName="dnsmasq-dns"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.173076 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d28406-b887-4c62-8b1c-f7005f6ee3c0" containerName="dnsmasq-dns"
Dec 02 20:21:20 crc kubenswrapper[4807]: E1202 20:21:20.173098 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.173104 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.173388 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b83e24a-ce7d-42b1-998f-1ede619914ff" containerName="barbican-db-sync"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.173431 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d28406-b887-4c62-8b1c-f7005f6ee3c0" containerName="dnsmasq-dns"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.173454 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.173508 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api-log"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.175040 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.177953 4807 scope.go:117] "RemoveContainer" containerID="9bc8ba85975bbbc26b9ff4c698a8b1d9950252e39bb81c90774aedd118d44b21"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.183162 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.183505 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.183805 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.189485 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.323464 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.323973 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.324004 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-config-data\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.324037 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxpp4\" (UniqueName: \"kubernetes.io/projected/bc49afa3-486b-481f-bb06-5b9bb2701021-kube-api-access-nxpp4\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.324076 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc49afa3-486b-481f-bb06-5b9bb2701021-logs\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.324113 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-public-tls-certs\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.324152 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.425766 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.425892 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.425929 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.425960 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-config-data\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.426003 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxpp4\" (UniqueName: \"kubernetes.io/projected/bc49afa3-486b-481f-bb06-5b9bb2701021-kube-api-access-nxpp4\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.426040 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc49afa3-486b-481f-bb06-5b9bb2701021-logs\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.426080 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-public-tls-certs\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.431354 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc49afa3-486b-481f-bb06-5b9bb2701021-logs\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.438374 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.439529 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-public-tls-certs\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.446529 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.451615 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-config-data\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.456464 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc49afa3-486b-481f-bb06-5b9bb2701021-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.456816 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxpp4\" (UniqueName: \"kubernetes.io/projected/bc49afa3-486b-481f-bb06-5b9bb2701021-kube-api-access-nxpp4\") pod \"watcher-api-0\" (UID: \"bc49afa3-486b-481f-bb06-5b9bb2701021\") " pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.525436 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.848239 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-76444866d4-7vv98"]
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.851154 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76444866d4-7vv98"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.861225 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-j9mxl"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.861833 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.862020 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.865403 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-logs\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.865495 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjngk\" (UniqueName: \"kubernetes.io/projected/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-kube-api-access-pjngk\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.865528 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-config-data\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98"
Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.865572 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-config-data-custom\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.865637 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-combined-ca-bundle\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.867897 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-f5f4b885-lpj6r"] Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.869696 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.895427 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.920780 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76444866d4-7vv98"] Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.971612 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/926c4543-c7c5-41aa-a5ed-46035ee41498-config-data-custom\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.971666 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-config-data-custom\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.971691 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c4543-c7c5-41aa-a5ed-46035ee41498-logs\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.985233 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h4ft\" (UniqueName: \"kubernetes.io/projected/926c4543-c7c5-41aa-a5ed-46035ee41498-kube-api-access-9h4ft\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: 
\"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.985352 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c4543-c7c5-41aa-a5ed-46035ee41498-combined-ca-bundle\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.985415 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-combined-ca-bundle\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.986127 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-logs\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.986173 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c4543-c7c5-41aa-a5ed-46035ee41498-config-data\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.986268 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjngk\" (UniqueName: 
\"kubernetes.io/projected/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-kube-api-access-pjngk\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.986311 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-config-data\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.987486 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-logs\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:20 crc kubenswrapper[4807]: I1202 20:21:20.999934 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-config-data-custom\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.007916 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-combined-ca-bundle\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.008837 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-config-data\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.040112 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e60dc88-91d1-4325-9832-f9a921502710" path="/var/lib/kubelet/pods/4e60dc88-91d1-4325-9832-f9a921502710/volumes" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.052318 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjngk\" (UniqueName: \"kubernetes.io/projected/845bcc3a-0d65-4ba8-bbb0-6f95d4778851-kube-api-access-pjngk\") pod \"barbican-keystone-listener-76444866d4-7vv98\" (UID: \"845bcc3a-0d65-4ba8-bbb0-6f95d4778851\") " pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.065994 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f5f4b885-lpj6r"] Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.117270 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c4543-c7c5-41aa-a5ed-46035ee41498-config-data\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.117899 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/926c4543-c7c5-41aa-a5ed-46035ee41498-config-data-custom\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.117994 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c4543-c7c5-41aa-a5ed-46035ee41498-logs\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.118120 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h4ft\" (UniqueName: \"kubernetes.io/projected/926c4543-c7c5-41aa-a5ed-46035ee41498-kube-api-access-9h4ft\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.118176 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c4543-c7c5-41aa-a5ed-46035ee41498-combined-ca-bundle\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.121947 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c4543-c7c5-41aa-a5ed-46035ee41498-logs\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.126372 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c4543-c7c5-41aa-a5ed-46035ee41498-config-data\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.128443 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/926c4543-c7c5-41aa-a5ed-46035ee41498-config-data-custom\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.143114 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c4543-c7c5-41aa-a5ed-46035ee41498-combined-ca-bundle\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.167238 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h4ft\" (UniqueName: \"kubernetes.io/projected/926c4543-c7c5-41aa-a5ed-46035ee41498-kube-api-access-9h4ft\") pod \"barbican-worker-f5f4b885-lpj6r\" (UID: \"926c4543-c7c5-41aa-a5ed-46035ee41498\") " pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.234766 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-f5f4b885-lpj6r" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.236176 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-76444866d4-7vv98" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.281776 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zlf57" event={"ID":"2b4b9175-26ae-4cff-8dd2-7682b1408271","Type":"ContainerStarted","Data":"ebf674255993b10f848b561bd58bf084a292128e378892fafe3b9572e0f922c8"} Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.323336 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rxmrt"] Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.354132 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rxmrt"] Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.354372 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.430103 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-zlf57" podStartSLOduration=5.244906028 podStartE2EDuration="1m0.430076479s" podCreationTimestamp="2025-12-02 20:20:21 +0000 UTC" firstStartedPulling="2025-12-02 20:20:24.00586877 +0000 UTC m=+1359.306776255" lastFinishedPulling="2025-12-02 20:21:19.191039211 +0000 UTC m=+1414.491946706" observedRunningTime="2025-12-02 20:21:21.396921519 +0000 UTC m=+1416.697829014" watchObservedRunningTime="2025-12-02 20:21:21.430076479 +0000 UTC m=+1416.730983974" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.484444 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 
20:21:21.484526 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.484585 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.484636 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-config\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.484672 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9flw\" (UniqueName: \"kubernetes.io/projected/21597064-29fe-4ba7-8272-80d8fca543f5-kube-api-access-m9flw\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.484734 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc 
kubenswrapper[4807]: I1202 20:21:21.505163 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6d77974966-h2jgz"] Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.515177 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.521201 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.547838 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d77974966-h2jgz"] Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.586761 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.586854 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-config\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.586896 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9flw\" (UniqueName: \"kubernetes.io/projected/21597064-29fe-4ba7-8272-80d8fca543f5-kube-api-access-m9flw\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.586937 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.587003 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.587024 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.587957 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.588485 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.589519 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-config\") pod 
\"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.590379 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.591149 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.631460 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9flw\" (UniqueName: \"kubernetes.io/projected/21597064-29fe-4ba7-8272-80d8fca543f5-kube-api-access-m9flw\") pod \"dnsmasq-dns-848cf88cfc-rxmrt\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.689030 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hc9k\" (UniqueName: \"kubernetes.io/projected/a08fc5b4-413a-42e3-85df-9fe9b8187746-kube-api-access-4hc9k\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.689640 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data-custom\") 
pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.689685 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-combined-ca-bundle\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.689758 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.689808 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a08fc5b4-413a-42e3-85df-9fe9b8187746-logs\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.777469 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.801872 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.803208 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data-custom\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.803284 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-combined-ca-bundle\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.803332 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.803370 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a08fc5b4-413a-42e3-85df-9fe9b8187746-logs\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.803492 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hc9k\" (UniqueName: \"kubernetes.io/projected/a08fc5b4-413a-42e3-85df-9fe9b8187746-kube-api-access-4hc9k\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " 
pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.809116 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a08fc5b4-413a-42e3-85df-9fe9b8187746-logs\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.814746 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.826380 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data-custom\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.834551 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-combined-ca-bundle\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: I1202 20:21:21.842933 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hc9k\" (UniqueName: \"kubernetes.io/projected/a08fc5b4-413a-42e3-85df-9fe9b8187746-kube-api-access-4hc9k\") pod \"barbican-api-6d77974966-h2jgz\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:21 crc kubenswrapper[4807]: 
I1202 20:21:21.852567 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.186553 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f5f4b885-lpj6r"] Dec 02 20:21:22 crc kubenswrapper[4807]: W1202 20:21:22.203050 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod926c4543_c7c5_41aa_a5ed_46035ee41498.slice/crio-3e98c5123799a1381b73e92c9957b68d2819864db37c3b164991e9ee46f98f84 WatchSource:0}: Error finding container 3e98c5123799a1381b73e92c9957b68d2819864db37c3b164991e9ee46f98f84: Status 404 returned error can't find the container with id 3e98c5123799a1381b73e92c9957b68d2819864db37c3b164991e9ee46f98f84 Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.237561 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.238001 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.250524 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.269866 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.269994 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.280193 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.284642 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="4e60dc88-91d1-4325-9832-f9a921502710" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": dial tcp 10.217.0.162:9322: i/o timeout (Client.Timeout exceeded while awaiting headers)" Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.302088 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.302438 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7cc8fc8c44-b8pmd" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.158:8443: connect: connection refused" Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.312030 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76444866d4-7vv98"] Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.367606 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f5f4b885-lpj6r" event={"ID":"926c4543-c7c5-41aa-a5ed-46035ee41498","Type":"ContainerStarted","Data":"3e98c5123799a1381b73e92c9957b68d2819864db37c3b164991e9ee46f98f84"} Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.378533 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76444866d4-7vv98" event={"ID":"845bcc3a-0d65-4ba8-bbb0-6f95d4778851","Type":"ContainerStarted","Data":"5c53d4a7e6be7113e16e29ffb0924b639401322892ffe1d896be3f3e06b84347"} Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.387242 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"bc49afa3-486b-481f-bb06-5b9bb2701021","Type":"ContainerStarted","Data":"7ae91a25d85ccb83467db6fc11808429e0a21278afd65e4d609fa98faee43f57"} Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.440816 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5cbfd7dcb-hzflv" podUID="f5570109-9e91-473c-8a41-47081ace3591" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.855660 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6d77974966-h2jgz"] Dec 02 20:21:22 crc kubenswrapper[4807]: I1202 20:21:22.927010 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rxmrt"] Dec 02 20:21:23 crc kubenswrapper[4807]: I1202 20:21:23.466582 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"bc49afa3-486b-481f-bb06-5b9bb2701021","Type":"ContainerStarted","Data":"cfe068155dbcf2aaee15b0b747b2249caa9691f2b41dae209ba011411a2f9c33"} Dec 02 20:21:23 crc kubenswrapper[4807]: I1202 20:21:23.466905 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"bc49afa3-486b-481f-bb06-5b9bb2701021","Type":"ContainerStarted","Data":"cfe43eb78815cc7beb9f0ecbea7bcc4c7e400f876f090787c189d2102e35fff7"} Dec 02 20:21:23 crc kubenswrapper[4807]: I1202 20:21:23.468833 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 02 20:21:23 crc kubenswrapper[4807]: I1202 20:21:23.486633 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="bc49afa3-486b-481f-bb06-5b9bb2701021" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.171:9322/\": dial tcp 10.217.0.171:9322: connect: connection refused" Dec 02 20:21:23 crc 
kubenswrapper[4807]: I1202 20:21:23.487102 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d77974966-h2jgz" event={"ID":"a08fc5b4-413a-42e3-85df-9fe9b8187746","Type":"ContainerStarted","Data":"aa7decc0e999f64769c6f88655247cdf99b06a604ad869ab16fcdaaa5a44ebfd"} Dec 02 20:21:23 crc kubenswrapper[4807]: I1202 20:21:23.505990 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" event={"ID":"21597064-29fe-4ba7-8272-80d8fca543f5","Type":"ContainerStarted","Data":"a7502c37baa9397207f505a88734bf00bfa15cfd33bac185f887113271b1e7e1"} Dec 02 20:21:23 crc kubenswrapper[4807]: I1202 20:21:23.521063 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.52102473 podStartE2EDuration="3.52102473s" podCreationTimestamp="2025-12-02 20:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:23.503539633 +0000 UTC m=+1418.804447128" watchObservedRunningTime="2025-12-02 20:21:23.52102473 +0000 UTC m=+1418.821932225" Dec 02 20:21:24 crc kubenswrapper[4807]: I1202 20:21:24.538049 4807 generic.go:334] "Generic (PLEG): container finished" podID="21597064-29fe-4ba7-8272-80d8fca543f5" containerID="286cbc0941bd2fab8e733535d78d68500bd7cd331feab13973e3cc5c9a5f0311" exitCode=0 Dec 02 20:21:24 crc kubenswrapper[4807]: I1202 20:21:24.538348 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" event={"ID":"21597064-29fe-4ba7-8272-80d8fca543f5","Type":"ContainerDied","Data":"286cbc0941bd2fab8e733535d78d68500bd7cd331feab13973e3cc5c9a5f0311"} Dec 02 20:21:24 crc kubenswrapper[4807]: I1202 20:21:24.578806 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d77974966-h2jgz" 
event={"ID":"a08fc5b4-413a-42e3-85df-9fe9b8187746","Type":"ContainerStarted","Data":"6fe59c218356b690142b88b0acf56b55b69cbf11c5d0a7b16fa78f33877f57de"} Dec 02 20:21:24 crc kubenswrapper[4807]: I1202 20:21:24.578877 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d77974966-h2jgz" event={"ID":"a08fc5b4-413a-42e3-85df-9fe9b8187746","Type":"ContainerStarted","Data":"adaacc6b61133dcc824bf80520581fa4f62cfa038d2d36af7f1e8e1fc1d3634e"} Dec 02 20:21:24 crc kubenswrapper[4807]: I1202 20:21:24.578916 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:24 crc kubenswrapper[4807]: I1202 20:21:24.579033 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:24 crc kubenswrapper[4807]: I1202 20:21:24.628412 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6d77974966-h2jgz" podStartSLOduration=3.628393495 podStartE2EDuration="3.628393495s" podCreationTimestamp="2025-12-02 20:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:24.619793831 +0000 UTC m=+1419.920701326" watchObservedRunningTime="2025-12-02 20:21:24.628393495 +0000 UTC m=+1419.929300990" Dec 02 20:21:25 crc kubenswrapper[4807]: I1202 20:21:25.526406 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 02 20:21:25 crc kubenswrapper[4807]: I1202 20:21:25.616476 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" event={"ID":"21597064-29fe-4ba7-8272-80d8fca543f5","Type":"ContainerStarted","Data":"088bce9e2f37b63b77b6ac063f555963a5198efc4512305c375192867011c3b9"} Dec 02 20:21:25 crc kubenswrapper[4807]: I1202 20:21:25.617018 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:25 crc kubenswrapper[4807]: I1202 20:21:25.655447 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" podStartSLOduration=4.655410824 podStartE2EDuration="4.655410824s" podCreationTimestamp="2025-12-02 20:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:25.645007536 +0000 UTC m=+1420.945915031" watchObservedRunningTime="2025-12-02 20:21:25.655410824 +0000 UTC m=+1420.956318319" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.119064 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5dff7976bd-s4t8d"] Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.121498 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.130874 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.131053 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.158454 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dff7976bd-s4t8d"] Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.293315 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d4efb1f-6d37-4673-94fc-33623db07604-logs\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.293831 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-public-tls-certs\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.293864 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-combined-ca-bundle\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.293942 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc57w\" (UniqueName: \"kubernetes.io/projected/0d4efb1f-6d37-4673-94fc-33623db07604-kube-api-access-lc57w\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.293991 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-config-data-custom\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.294019 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-internal-tls-certs\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 
20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.294061 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-config-data\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.395956 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-config-data-custom\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.396032 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-internal-tls-certs\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.396076 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-config-data\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.396111 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d4efb1f-6d37-4673-94fc-33623db07604-logs\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 
20:21:26.396187 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-public-tls-certs\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.396214 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-combined-ca-bundle\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.396246 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc57w\" (UniqueName: \"kubernetes.io/projected/0d4efb1f-6d37-4673-94fc-33623db07604-kube-api-access-lc57w\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.397791 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d4efb1f-6d37-4673-94fc-33623db07604-logs\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.414259 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-config-data\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.415176 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-public-tls-certs\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.417519 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-internal-tls-certs\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.420762 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc57w\" (UniqueName: \"kubernetes.io/projected/0d4efb1f-6d37-4673-94fc-33623db07604-kube-api-access-lc57w\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.422574 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-combined-ca-bundle\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.432469 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d4efb1f-6d37-4673-94fc-33623db07604-config-data-custom\") pod \"barbican-api-5dff7976bd-s4t8d\" (UID: \"0d4efb1f-6d37-4673-94fc-33623db07604\") " pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.481022 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:26 crc kubenswrapper[4807]: I1202 20:21:26.627285 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 20:21:28 crc kubenswrapper[4807]: I1202 20:21:28.079515 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-779468644-dc9lw" Dec 02 20:21:28 crc kubenswrapper[4807]: I1202 20:21:28.119290 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dff7976bd-s4t8d"] Dec 02 20:21:28 crc kubenswrapper[4807]: W1202 20:21:28.129938 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d4efb1f_6d37_4673_94fc_33623db07604.slice/crio-f33a001ba2bf92477ba10f513ca3f956dd6e37abe986ffbe6c08b2dc0c47df1f WatchSource:0}: Error finding container f33a001ba2bf92477ba10f513ca3f956dd6e37abe986ffbe6c08b2dc0c47df1f: Status 404 returned error can't find the container with id f33a001ba2bf92477ba10f513ca3f956dd6e37abe986ffbe6c08b2dc0c47df1f Dec 02 20:21:28 crc kubenswrapper[4807]: I1202 20:21:28.678824 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dff7976bd-s4t8d" event={"ID":"0d4efb1f-6d37-4673-94fc-33623db07604","Type":"ContainerStarted","Data":"51eac17e5254ad63a117035eb896d0aa52b9f2d0994cc7bae8dc29c88e0acb33"} Dec 02 20:21:28 crc kubenswrapper[4807]: I1202 20:21:28.679628 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dff7976bd-s4t8d" event={"ID":"0d4efb1f-6d37-4673-94fc-33623db07604","Type":"ContainerStarted","Data":"f33a001ba2bf92477ba10f513ca3f956dd6e37abe986ffbe6c08b2dc0c47df1f"} Dec 02 20:21:28 crc kubenswrapper[4807]: I1202 20:21:28.702181 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f5f4b885-lpj6r" 
event={"ID":"926c4543-c7c5-41aa-a5ed-46035ee41498","Type":"ContainerStarted","Data":"b773d08deb73e3f08c4747cb4865bb2fdaa32a4f518d7ef61c868a6088ff0608"} Dec 02 20:21:28 crc kubenswrapper[4807]: I1202 20:21:28.702252 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f5f4b885-lpj6r" event={"ID":"926c4543-c7c5-41aa-a5ed-46035ee41498","Type":"ContainerStarted","Data":"48bf34e97ed34eb227e1fd98b4a5f1d799024126436a2463a000f923f1812b65"} Dec 02 20:21:28 crc kubenswrapper[4807]: I1202 20:21:28.718482 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76444866d4-7vv98" event={"ID":"845bcc3a-0d65-4ba8-bbb0-6f95d4778851","Type":"ContainerStarted","Data":"bb0f66786d7a11b61cdabc302c4fa21f95919f0ec0e29bd221b888aa924b9e5c"} Dec 02 20:21:28 crc kubenswrapper[4807]: I1202 20:21:28.718538 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76444866d4-7vv98" event={"ID":"845bcc3a-0d65-4ba8-bbb0-6f95d4778851","Type":"ContainerStarted","Data":"b8ae73d82af49b87fb3fe14229c76e44a7b07b0f1b297a68e1d58f77d548dcb0"} Dec 02 20:21:28 crc kubenswrapper[4807]: I1202 20:21:28.740006 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-f5f4b885-lpj6r" podStartSLOduration=3.50107234 podStartE2EDuration="8.739980107s" podCreationTimestamp="2025-12-02 20:21:20 +0000 UTC" firstStartedPulling="2025-12-02 20:21:22.217827866 +0000 UTC m=+1417.518735351" lastFinishedPulling="2025-12-02 20:21:27.456735623 +0000 UTC m=+1422.757643118" observedRunningTime="2025-12-02 20:21:28.717607095 +0000 UTC m=+1424.018514590" watchObservedRunningTime="2025-12-02 20:21:28.739980107 +0000 UTC m=+1424.040887602" Dec 02 20:21:29 crc kubenswrapper[4807]: I1202 20:21:29.584932 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="bc49afa3-486b-481f-bb06-5b9bb2701021" containerName="watcher-api" 
probeResult="failure" output="Get \"https://10.217.0.171:9322/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 20:21:29 crc kubenswrapper[4807]: I1202 20:21:29.605873 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 02 20:21:29 crc kubenswrapper[4807]: I1202 20:21:29.636154 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-76444866d4-7vv98" podStartSLOduration=4.546750821 podStartE2EDuration="9.636127708s" podCreationTimestamp="2025-12-02 20:21:20 +0000 UTC" firstStartedPulling="2025-12-02 20:21:22.328410725 +0000 UTC m=+1417.629318210" lastFinishedPulling="2025-12-02 20:21:27.417787602 +0000 UTC m=+1422.718695097" observedRunningTime="2025-12-02 20:21:28.758618578 +0000 UTC m=+1424.059526063" watchObservedRunningTime="2025-12-02 20:21:29.636127708 +0000 UTC m=+1424.937035203" Dec 02 20:21:29 crc kubenswrapper[4807]: I1202 20:21:29.731617 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dff7976bd-s4t8d" event={"ID":"0d4efb1f-6d37-4673-94fc-33623db07604","Type":"ContainerStarted","Data":"8384f0b5848fbaafb645c72f24c9eed3d6fc569e0a9fc52a99775ad32d640863"} Dec 02 20:21:29 crc kubenswrapper[4807]: I1202 20:21:29.733424 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:29 crc kubenswrapper[4807]: I1202 20:21:29.733458 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:29 crc kubenswrapper[4807]: I1202 20:21:29.735311 4807 generic.go:334] "Generic (PLEG): container finished" podID="2b4b9175-26ae-4cff-8dd2-7682b1408271" containerID="ebf674255993b10f848b561bd58bf084a292128e378892fafe3b9572e0f922c8" exitCode=0 Dec 02 20:21:29 crc kubenswrapper[4807]: I1202 20:21:29.735511 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-zlf57" event={"ID":"2b4b9175-26ae-4cff-8dd2-7682b1408271","Type":"ContainerDied","Data":"ebf674255993b10f848b561bd58bf084a292128e378892fafe3b9572e0f922c8"} Dec 02 20:21:29 crc kubenswrapper[4807]: I1202 20:21:29.792758 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5dff7976bd-s4t8d" podStartSLOduration=3.792733617 podStartE2EDuration="3.792733617s" podCreationTimestamp="2025-12-02 20:21:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:29.759815214 +0000 UTC m=+1425.060722719" watchObservedRunningTime="2025-12-02 20:21:29.792733617 +0000 UTC m=+1425.093641112" Dec 02 20:21:30 crc kubenswrapper[4807]: I1202 20:21:30.528320 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Dec 02 20:21:30 crc kubenswrapper[4807]: I1202 20:21:30.564015 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Dec 02 20:21:30 crc kubenswrapper[4807]: I1202 20:21:30.772069 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 02 20:21:31 crc kubenswrapper[4807]: I1202 20:21:31.617603 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5597979745-dn972" Dec 02 20:21:31 crc kubenswrapper[4807]: I1202 20:21:31.714742 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-779468644-dc9lw"] Dec 02 20:21:31 crc kubenswrapper[4807]: I1202 20:21:31.715058 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-779468644-dc9lw" podUID="9f1c6d96-c355-4e05-8823-0f83ad828b3d" containerName="neutron-api" containerID="cri-o://09f9a52eda291674f8c6be27d86f7266ff9ac29b17474f0a591702b419e1e6c4" gracePeriod=30 Dec 02 20:21:31 crc kubenswrapper[4807]: I1202 
20:21:31.715259 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-779468644-dc9lw" podUID="9f1c6d96-c355-4e05-8823-0f83ad828b3d" containerName="neutron-httpd" containerID="cri-o://385a45443efe28feba739e07293b4f8074e2dff09ff876a0db4ad5b92e9c4db5" gracePeriod=30 Dec 02 20:21:31 crc kubenswrapper[4807]: I1202 20:21:31.809051 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:31 crc kubenswrapper[4807]: I1202 20:21:31.897899 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-h5crw"] Dec 02 20:21:31 crc kubenswrapper[4807]: I1202 20:21:31.898182 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" podUID="ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" containerName="dnsmasq-dns" containerID="cri-o://00aa61c4b03d62296708ec8fcf10db66f94b54f6c5c75bd017f7ad7fc374abec" gracePeriod=10 Dec 02 20:21:32 crc kubenswrapper[4807]: I1202 20:21:32.294273 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7cc8fc8c44-b8pmd" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.158:8443: connect: connection refused" Dec 02 20:21:32 crc kubenswrapper[4807]: I1202 20:21:32.294395 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7cc8fc8c44-b8pmd" Dec 02 20:21:32 crc kubenswrapper[4807]: I1202 20:21:32.295640 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"1a2bc25490b727e95a676dc6e738dd9535b3f9346206cb4c536bfdd80fd32d19"} pod="openstack/horizon-7cc8fc8c44-b8pmd" containerMessage="Container horizon failed startup probe, will be restarted" Dec 02 20:21:32 crc 
kubenswrapper[4807]: I1202 20:21:32.295698 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cc8fc8c44-b8pmd" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon" containerID="cri-o://1a2bc25490b727e95a676dc6e738dd9535b3f9346206cb4c536bfdd80fd32d19" gracePeriod=30 Dec 02 20:21:32 crc kubenswrapper[4807]: I1202 20:21:32.820423 4807 generic.go:334] "Generic (PLEG): container finished" podID="9f1c6d96-c355-4e05-8823-0f83ad828b3d" containerID="385a45443efe28feba739e07293b4f8074e2dff09ff876a0db4ad5b92e9c4db5" exitCode=0 Dec 02 20:21:32 crc kubenswrapper[4807]: I1202 20:21:32.820529 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-779468644-dc9lw" event={"ID":"9f1c6d96-c355-4e05-8823-0f83ad828b3d","Type":"ContainerDied","Data":"385a45443efe28feba739e07293b4f8074e2dff09ff876a0db4ad5b92e9c4db5"} Dec 02 20:21:32 crc kubenswrapper[4807]: I1202 20:21:32.830351 4807 generic.go:334] "Generic (PLEG): container finished" podID="ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" containerID="00aa61c4b03d62296708ec8fcf10db66f94b54f6c5c75bd017f7ad7fc374abec" exitCode=0 Dec 02 20:21:32 crc kubenswrapper[4807]: I1202 20:21:32.830411 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" event={"ID":"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc","Type":"ContainerDied","Data":"00aa61c4b03d62296708ec8fcf10db66f94b54f6c5c75bd017f7ad7fc374abec"} Dec 02 20:21:32 crc kubenswrapper[4807]: I1202 20:21:32.943530 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" podUID="ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.166:5353: connect: connection refused" Dec 02 20:21:33 crc kubenswrapper[4807]: I1202 20:21:33.756066 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:34 crc 
kubenswrapper[4807]: I1202 20:21:34.642384 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:36 crc kubenswrapper[4807]: I1202 20:21:36.459204 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5cbfd7dcb-hzflv" Dec 02 20:21:37 crc kubenswrapper[4807]: I1202 20:21:37.943099 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" podUID="ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.166:5353: connect: connection refused" Dec 02 20:21:38 crc kubenswrapper[4807]: I1202 20:21:38.487957 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:38 crc kubenswrapper[4807]: I1202 20:21:38.762331 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5cbfd7dcb-hzflv" Dec 02 20:21:38 crc kubenswrapper[4807]: I1202 20:21:38.853817 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cc8fc8c44-b8pmd"] Dec 02 20:21:39 crc kubenswrapper[4807]: I1202 20:21:39.124876 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dff7976bd-s4t8d" Dec 02 20:21:39 crc kubenswrapper[4807]: I1202 20:21:39.224671 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6d77974966-h2jgz"] Dec 02 20:21:39 crc kubenswrapper[4807]: I1202 20:21:39.226824 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6d77974966-h2jgz" podUID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerName="barbican-api-log" containerID="cri-o://adaacc6b61133dcc824bf80520581fa4f62cfa038d2d36af7f1e8e1fc1d3634e" gracePeriod=30 Dec 02 20:21:39 crc kubenswrapper[4807]: I1202 20:21:39.227060 4807 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/barbican-api-6d77974966-h2jgz" podUID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerName="barbican-api" containerID="cri-o://6fe59c218356b690142b88b0acf56b55b69cbf11c5d0a7b16fa78f33877f57de" gracePeriod=30 Dec 02 20:21:39 crc kubenswrapper[4807]: E1202 20:21:39.573480 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda08fc5b4_413a_42e3_85df_9fe9b8187746.slice/crio-adaacc6b61133dcc824bf80520581fa4f62cfa038d2d36af7f1e8e1fc1d3634e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda08fc5b4_413a_42e3_85df_9fe9b8187746.slice/crio-conmon-adaacc6b61133dcc824bf80520581fa4f62cfa038d2d36af7f1e8e1fc1d3634e.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:21:39 crc kubenswrapper[4807]: I1202 20:21:39.953252 4807 generic.go:334] "Generic (PLEG): container finished" podID="9f1c6d96-c355-4e05-8823-0f83ad828b3d" containerID="09f9a52eda291674f8c6be27d86f7266ff9ac29b17474f0a591702b419e1e6c4" exitCode=0 Dec 02 20:21:39 crc kubenswrapper[4807]: I1202 20:21:39.953279 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-779468644-dc9lw" event={"ID":"9f1c6d96-c355-4e05-8823-0f83ad828b3d","Type":"ContainerDied","Data":"09f9a52eda291674f8c6be27d86f7266ff9ac29b17474f0a591702b419e1e6c4"} Dec 02 20:21:39 crc kubenswrapper[4807]: I1202 20:21:39.956940 4807 generic.go:334] "Generic (PLEG): container finished" podID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerID="adaacc6b61133dcc824bf80520581fa4f62cfa038d2d36af7f1e8e1fc1d3634e" exitCode=143 Dec 02 20:21:39 crc kubenswrapper[4807]: I1202 20:21:39.957029 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d77974966-h2jgz" 
event={"ID":"a08fc5b4-413a-42e3-85df-9fe9b8187746","Type":"ContainerDied","Data":"adaacc6b61133dcc824bf80520581fa4f62cfa038d2d36af7f1e8e1fc1d3634e"} Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.399618 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zlf57" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.482484 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-scripts\") pod \"2b4b9175-26ae-4cff-8dd2-7682b1408271\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.482628 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b4b9175-26ae-4cff-8dd2-7682b1408271-etc-machine-id\") pod \"2b4b9175-26ae-4cff-8dd2-7682b1408271\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.482799 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zfcg\" (UniqueName: \"kubernetes.io/projected/2b4b9175-26ae-4cff-8dd2-7682b1408271-kube-api-access-7zfcg\") pod \"2b4b9175-26ae-4cff-8dd2-7682b1408271\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.482887 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-config-data\") pod \"2b4b9175-26ae-4cff-8dd2-7682b1408271\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.482930 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-combined-ca-bundle\") 
pod \"2b4b9175-26ae-4cff-8dd2-7682b1408271\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.482950 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-db-sync-config-data\") pod \"2b4b9175-26ae-4cff-8dd2-7682b1408271\" (UID: \"2b4b9175-26ae-4cff-8dd2-7682b1408271\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.488803 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2b4b9175-26ae-4cff-8dd2-7682b1408271" (UID: "2b4b9175-26ae-4cff-8dd2-7682b1408271"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.488911 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b4b9175-26ae-4cff-8dd2-7682b1408271-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2b4b9175-26ae-4cff-8dd2-7682b1408271" (UID: "2b4b9175-26ae-4cff-8dd2-7682b1408271"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.491001 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4b9175-26ae-4cff-8dd2-7682b1408271-kube-api-access-7zfcg" (OuterVolumeSpecName: "kube-api-access-7zfcg") pod "2b4b9175-26ae-4cff-8dd2-7682b1408271" (UID: "2b4b9175-26ae-4cff-8dd2-7682b1408271"). InnerVolumeSpecName "kube-api-access-7zfcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.493955 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-scripts" (OuterVolumeSpecName: "scripts") pod "2b4b9175-26ae-4cff-8dd2-7682b1408271" (UID: "2b4b9175-26ae-4cff-8dd2-7682b1408271"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.584472 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.588258 4807 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2b4b9175-26ae-4cff-8dd2-7682b1408271-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.588316 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zfcg\" (UniqueName: \"kubernetes.io/projected/2b4b9175-26ae-4cff-8dd2-7682b1408271-kube-api-access-7zfcg\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.588326 4807 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.588338 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.631866 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") 
pod "2b4b9175-26ae-4cff-8dd2-7682b1408271" (UID: "2b4b9175-26ae-4cff-8dd2-7682b1408271"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.671945 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-config-data" (OuterVolumeSpecName: "config-data") pod "2b4b9175-26ae-4cff-8dd2-7682b1408271" (UID: "2b4b9175-26ae-4cff-8dd2-7682b1408271"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.690351 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-sb\") pod \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.690447 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-svc\") pod \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.690465 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-nb\") pod \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.690526 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-config\") pod \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " 
Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.690546 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9zjm\" (UniqueName: \"kubernetes.io/projected/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-kube-api-access-m9zjm\") pod \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.690701 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-swift-storage-0\") pod \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\" (UID: \"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.691175 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.691189 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4b9175-26ae-4cff-8dd2-7682b1408271-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.708332 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-kube-api-access-m9zjm" (OuterVolumeSpecName: "kube-api-access-m9zjm") pod "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" (UID: "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc"). InnerVolumeSpecName "kube-api-access-m9zjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.760565 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-779468644-dc9lw" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.795138 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-httpd-config\") pod \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.795290 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-ovndb-tls-certs\") pod \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.797338 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-combined-ca-bundle\") pod \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.797417 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtm6r\" (UniqueName: \"kubernetes.io/projected/9f1c6d96-c355-4e05-8823-0f83ad828b3d-kube-api-access-mtm6r\") pod \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.797487 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-config\") pod \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\" (UID: \"9f1c6d96-c355-4e05-8823-0f83ad828b3d\") " Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.798282 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9zjm\" (UniqueName: 
\"kubernetes.io/projected/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-kube-api-access-m9zjm\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.810280 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "9f1c6d96-c355-4e05-8823-0f83ad828b3d" (UID: "9f1c6d96-c355-4e05-8823-0f83ad828b3d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.810511 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" (UID: "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.810524 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" (UID: "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.829164 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f1c6d96-c355-4e05-8823-0f83ad828b3d-kube-api-access-mtm6r" (OuterVolumeSpecName: "kube-api-access-mtm6r") pod "9f1c6d96-c355-4e05-8823-0f83ad828b3d" (UID: "9f1c6d96-c355-4e05-8823-0f83ad828b3d"). InnerVolumeSpecName "kube-api-access-mtm6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.846543 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" (UID: "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.873898 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" (UID: "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.881934 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-config" (OuterVolumeSpecName: "config") pod "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" (UID: "ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.885360 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-config" (OuterVolumeSpecName: "config") pod "9f1c6d96-c355-4e05-8823-0f83ad828b3d" (UID: "9f1c6d96-c355-4e05-8823-0f83ad828b3d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.901134 4807 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.901177 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.901191 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.901201 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.901211 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.901221 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtm6r\" (UniqueName: \"kubernetes.io/projected/9f1c6d96-c355-4e05-8823-0f83ad828b3d-kube-api-access-mtm6r\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.901232 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.901242 4807 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.938968 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f1c6d96-c355-4e05-8823-0f83ad828b3d" (UID: "9f1c6d96-c355-4e05-8823-0f83ad828b3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.966963 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-779468644-dc9lw" event={"ID":"9f1c6d96-c355-4e05-8823-0f83ad828b3d","Type":"ContainerDied","Data":"f0b739c61fcefa1306e3cea6c38def130142d0b973363fe4b66f1422a3d7f635"} Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.967022 4807 scope.go:117] "RemoveContainer" containerID="385a45443efe28feba739e07293b4f8074e2dff09ff876a0db4ad5b92e9c4db5" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.967151 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-779468644-dc9lw" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.970074 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" event={"ID":"ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc","Type":"ContainerDied","Data":"407cf19254ac1fd6dab9bb853d1022efa30f6216923c9107997a86fef00f3540"} Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.970143 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-h5crw" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.977343 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-zlf57" Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.990054 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="ceilometer-central-agent" containerID="cri-o://68014bcc35f5cefa43ab4472ad58faa656b225d8d308e86a6303b1b2499aa487" gracePeriod=30 Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.990661 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="proxy-httpd" containerID="cri-o://c57848eb36d5129057028cc80e4d504065ffd64db4872dd0e2f24de4a1972e86" gracePeriod=30 Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.990750 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="sg-core" containerID="cri-o://1a9e5779281cfa8d583cecb0461d40d778409de11d8e92227f9aa90367484c28" gracePeriod=30 Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.990804 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="ceilometer-notification-agent" containerID="cri-o://151a5b20e12ef473a2ac64a0c900d10c72e66ef7e2f8c1288e3e6b8a6e8bdf9f" gracePeriod=30 Dec 02 20:21:40 crc kubenswrapper[4807]: I1202 20:21:40.998592 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "9f1c6d96-c355-4e05-8823-0f83ad828b3d" (UID: "9f1c6d96-c355-4e05-8823-0f83ad828b3d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.005598 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zlf57" event={"ID":"2b4b9175-26ae-4cff-8dd2-7682b1408271","Type":"ContainerDied","Data":"16cc6fefb2179350a0cc74626d6fbf5dfa16a404bd96de7b290381cc882313f1"} Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.005767 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16cc6fefb2179350a0cc74626d6fbf5dfa16a404bd96de7b290381cc882313f1" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.005854 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fc7d360-e78a-4042-98e1-f8a2b97c10ab","Type":"ContainerStarted","Data":"c57848eb36d5129057028cc80e4d504065ffd64db4872dd0e2f24de4a1972e86"} Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.005879 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.007617 4807 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.007637 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1c6d96-c355-4e05-8823-0f83ad828b3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.016191 4807 scope.go:117] "RemoveContainer" containerID="09f9a52eda291674f8c6be27d86f7266ff9ac29b17474f0a591702b419e1e6c4" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.021661 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.510240475 podStartE2EDuration="1m20.021634655s" 
podCreationTimestamp="2025-12-02 20:20:21 +0000 UTC" firstStartedPulling="2025-12-02 20:20:23.762269454 +0000 UTC m=+1359.063176949" lastFinishedPulling="2025-12-02 20:21:40.273663634 +0000 UTC m=+1435.574571129" observedRunningTime="2025-12-02 20:21:41.016478233 +0000 UTC m=+1436.317385728" watchObservedRunningTime="2025-12-02 20:21:41.021634655 +0000 UTC m=+1436.322542140" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.048779 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-h5crw"] Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.059872 4807 scope.go:117] "RemoveContainer" containerID="00aa61c4b03d62296708ec8fcf10db66f94b54f6c5c75bd017f7ad7fc374abec" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.063529 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-h5crw"] Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.084048 4807 scope.go:117] "RemoveContainer" containerID="5a07ee4176ba65fa0a6bdb417a528468f20d7f33c9a1db630d3deb5b84f7e191" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.355672 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-779468644-dc9lw"] Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.371250 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-779468644-dc9lw"] Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.890617 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fdw8d"] Dec 02 20:21:41 crc kubenswrapper[4807]: E1202 20:21:41.891160 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4b9175-26ae-4cff-8dd2-7682b1408271" containerName="cinder-db-sync" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.891172 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4b9175-26ae-4cff-8dd2-7682b1408271" containerName="cinder-db-sync" Dec 02 20:21:41 crc kubenswrapper[4807]: E1202 
20:21:41.891192 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1c6d96-c355-4e05-8823-0f83ad828b3d" containerName="neutron-api" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.891198 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1c6d96-c355-4e05-8823-0f83ad828b3d" containerName="neutron-api" Dec 02 20:21:41 crc kubenswrapper[4807]: E1202 20:21:41.891223 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1c6d96-c355-4e05-8823-0f83ad828b3d" containerName="neutron-httpd" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.891230 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1c6d96-c355-4e05-8823-0f83ad828b3d" containerName="neutron-httpd" Dec 02 20:21:41 crc kubenswrapper[4807]: E1202 20:21:41.891242 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" containerName="init" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.891248 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" containerName="init" Dec 02 20:21:41 crc kubenswrapper[4807]: E1202 20:21:41.891267 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" containerName="dnsmasq-dns" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.891273 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" containerName="dnsmasq-dns" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.891436 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" containerName="dnsmasq-dns" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.891465 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4b9175-26ae-4cff-8dd2-7682b1408271" containerName="cinder-db-sync" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.891474 4807 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9f1c6d96-c355-4e05-8823-0f83ad828b3d" containerName="neutron-httpd" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.891489 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f1c6d96-c355-4e05-8823-0f83ad828b3d" containerName="neutron-api" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.892554 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.910768 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.912923 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.921986 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.922217 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.923358 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.926473 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fdw8d"] Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.932448 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mc9wl" Dec 02 20:21:41 crc kubenswrapper[4807]: I1202 20:21:41.946814 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.046152 4807 generic.go:334] "Generic (PLEG): container finished" podID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" 
containerID="c57848eb36d5129057028cc80e4d504065ffd64db4872dd0e2f24de4a1972e86" exitCode=0 Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.046204 4807 generic.go:334] "Generic (PLEG): container finished" podID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerID="1a9e5779281cfa8d583cecb0461d40d778409de11d8e92227f9aa90367484c28" exitCode=2 Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.046220 4807 generic.go:334] "Generic (PLEG): container finished" podID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerID="68014bcc35f5cefa43ab4472ad58faa656b225d8d308e86a6303b1b2499aa487" exitCode=0 Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.046332 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fc7d360-e78a-4042-98e1-f8a2b97c10ab","Type":"ContainerDied","Data":"c57848eb36d5129057028cc80e4d504065ffd64db4872dd0e2f24de4a1972e86"} Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.046375 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fc7d360-e78a-4042-98e1-f8a2b97c10ab","Type":"ContainerDied","Data":"1a9e5779281cfa8d583cecb0461d40d778409de11d8e92227f9aa90367484c28"} Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.046391 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fc7d360-e78a-4042-98e1-f8a2b97c10ab","Type":"ContainerDied","Data":"68014bcc35f5cefa43ab4472ad58faa656b225d8d308e86a6303b1b2499aa487"} Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.051207 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtpcf\" (UniqueName: \"kubernetes.io/projected/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-kube-api-access-jtpcf\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.051357 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.051406 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlrvh\" (UniqueName: \"kubernetes.io/projected/c923e91c-9fb7-4246-9e86-44b80979855e-kube-api-access-tlrvh\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.051500 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-svc\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.051524 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c923e91c-9fb7-4246-9e86-44b80979855e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.051550 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 
20:21:42.051582 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.051637 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.051669 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-config\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.051748 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.051787 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-scripts\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.051811 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.154423 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtpcf\" (UniqueName: \"kubernetes.io/projected/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-kube-api-access-jtpcf\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.154517 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.154561 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlrvh\" (UniqueName: \"kubernetes.io/projected/c923e91c-9fb7-4246-9e86-44b80979855e-kube-api-access-tlrvh\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.154613 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-svc\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.154633 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c923e91c-9fb7-4246-9e86-44b80979855e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.155321 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c923e91c-9fb7-4246-9e86-44b80979855e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.155769 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.155344 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.155877 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.155936 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-svc\") pod 
\"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.156466 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.156854 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.156943 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.158827 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-config\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.158944 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.158975 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-scripts\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.159006 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.159120 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.160895 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.161812 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-config\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.174769 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.175104 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.176138 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-scripts\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.180052 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.181425 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.192856 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtpcf\" (UniqueName: \"kubernetes.io/projected/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-kube-api-access-jtpcf\") pod \"dnsmasq-dns-6578955fd5-fdw8d\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.193998 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.213637 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlrvh\" (UniqueName: \"kubernetes.io/projected/c923e91c-9fb7-4246-9e86-44b80979855e-kube-api-access-tlrvh\") pod \"cinder-scheduler-0\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") 
" pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.221507 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.256271 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.266097 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f656871-7f26-4816-932c-326222105302-logs\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.266197 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.266337 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data-custom\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.266410 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f656871-7f26-4816-932c-326222105302-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.266447 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.266522 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-scripts\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.266562 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drlxj\" (UniqueName: \"kubernetes.io/projected/4f656871-7f26-4816-932c-326222105302-kube-api-access-drlxj\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.376107 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f656871-7f26-4816-932c-326222105302-logs\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.376467 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.376565 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.376620 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f656871-7f26-4816-932c-326222105302-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.376649 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.376711 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-scripts\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.376769 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drlxj\" (UniqueName: \"kubernetes.io/projected/4f656871-7f26-4816-932c-326222105302-kube-api-access-drlxj\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.377855 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f656871-7f26-4816-932c-326222105302-logs\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.378743 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f656871-7f26-4816-932c-326222105302-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.389314 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-scripts\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.395522 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.401525 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.407241 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drlxj\" (UniqueName: \"kubernetes.io/projected/4f656871-7f26-4816-932c-326222105302-kube-api-access-drlxj\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.431537 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data-custom\") pod \"cinder-api-0\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: 
I1202 20:21:42.493380 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.718750 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d77974966-h2jgz" podUID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.175:9311/healthcheck\": read tcp 10.217.0.2:47838->10.217.0.175:9311: read: connection reset by peer" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.732410 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6d77974966-h2jgz" podUID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.175:9311/healthcheck\": read tcp 10.217.0.2:47824->10.217.0.175:9311: read: connection reset by peer" Dec 02 20:21:42 crc kubenswrapper[4807]: I1202 20:21:42.911480 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.015732 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f1c6d96-c355-4e05-8823-0f83ad828b3d" path="/var/lib/kubelet/pods/9f1c6d96-c355-4e05-8823-0f83ad828b3d/volumes" Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.016303 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc" path="/var/lib/kubelet/pods/ad5dcefe-dfdd-40cb-bd3e-d3121ac92ccc/volumes" Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.022964 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-844d66f984-gvswh" Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.125920 4807 generic.go:334] "Generic (PLEG): container finished" podID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" 
containerID="151a5b20e12ef473a2ac64a0c900d10c72e66ef7e2f8c1288e3e6b8a6e8bdf9f" exitCode=0 Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.125985 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fc7d360-e78a-4042-98e1-f8a2b97c10ab","Type":"ContainerDied","Data":"151a5b20e12ef473a2ac64a0c900d10c72e66ef7e2f8c1288e3e6b8a6e8bdf9f"} Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.141911 4807 generic.go:334] "Generic (PLEG): container finished" podID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerID="6fe59c218356b690142b88b0acf56b55b69cbf11c5d0a7b16fa78f33877f57de" exitCode=0 Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.144127 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d77974966-h2jgz" event={"ID":"a08fc5b4-413a-42e3-85df-9fe9b8187746","Type":"ContainerDied","Data":"6fe59c218356b690142b88b0acf56b55b69cbf11c5d0a7b16fa78f33877f57de"} Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.222249 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fdw8d"] Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.443791 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.656605 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.734499 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.755913 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data-custom\") pod \"a08fc5b4-413a-42e3-85df-9fe9b8187746\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.756037 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a08fc5b4-413a-42e3-85df-9fe9b8187746-logs\") pod \"a08fc5b4-413a-42e3-85df-9fe9b8187746\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.756259 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-combined-ca-bundle\") pod \"a08fc5b4-413a-42e3-85df-9fe9b8187746\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.756299 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hc9k\" (UniqueName: \"kubernetes.io/projected/a08fc5b4-413a-42e3-85df-9fe9b8187746-kube-api-access-4hc9k\") pod \"a08fc5b4-413a-42e3-85df-9fe9b8187746\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.756383 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data\") pod \"a08fc5b4-413a-42e3-85df-9fe9b8187746\" (UID: \"a08fc5b4-413a-42e3-85df-9fe9b8187746\") " Dec 02 20:21:43 
crc kubenswrapper[4807]: I1202 20:21:43.765891 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a08fc5b4-413a-42e3-85df-9fe9b8187746-logs" (OuterVolumeSpecName: "logs") pod "a08fc5b4-413a-42e3-85df-9fe9b8187746" (UID: "a08fc5b4-413a-42e3-85df-9fe9b8187746"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.784766 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08fc5b4-413a-42e3-85df-9fe9b8187746-kube-api-access-4hc9k" (OuterVolumeSpecName: "kube-api-access-4hc9k") pod "a08fc5b4-413a-42e3-85df-9fe9b8187746" (UID: "a08fc5b4-413a-42e3-85df-9fe9b8187746"). InnerVolumeSpecName "kube-api-access-4hc9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.800585 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a08fc5b4-413a-42e3-85df-9fe9b8187746" (UID: "a08fc5b4-413a-42e3-85df-9fe9b8187746"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.863297 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a08fc5b4-413a-42e3-85df-9fe9b8187746-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.863364 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hc9k\" (UniqueName: \"kubernetes.io/projected/a08fc5b4-413a-42e3-85df-9fe9b8187746-kube-api-access-4hc9k\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.863375 4807 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:43 crc kubenswrapper[4807]: I1202 20:21:43.997964 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a08fc5b4-413a-42e3-85df-9fe9b8187746" (UID: "a08fc5b4-413a-42e3-85df-9fe9b8187746"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.003142 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.021018 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data" (OuterVolumeSpecName: "config-data") pod "a08fc5b4-413a-42e3-85df-9fe9b8187746" (UID: "a08fc5b4-413a-42e3-85df-9fe9b8187746"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.080608 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-run-httpd\") pod \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.080807 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-log-httpd\") pod \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.080883 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-combined-ca-bundle\") pod \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.080961 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w45f2\" (UniqueName: \"kubernetes.io/projected/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-kube-api-access-w45f2\") pod \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.081039 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-scripts\") pod \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.081080 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-sg-core-conf-yaml\") pod \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.081107 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-config-data\") pod \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\" (UID: \"4fc7d360-e78a-4042-98e1-f8a2b97c10ab\") " Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.082044 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.082065 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a08fc5b4-413a-42e3-85df-9fe9b8187746-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.083494 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4fc7d360-e78a-4042-98e1-f8a2b97c10ab" (UID: "4fc7d360-e78a-4042-98e1-f8a2b97c10ab"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.087121 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4fc7d360-e78a-4042-98e1-f8a2b97c10ab" (UID: "4fc7d360-e78a-4042-98e1-f8a2b97c10ab"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.125342 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-scripts" (OuterVolumeSpecName: "scripts") pod "4fc7d360-e78a-4042-98e1-f8a2b97c10ab" (UID: "4fc7d360-e78a-4042-98e1-f8a2b97c10ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.125539 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-kube-api-access-w45f2" (OuterVolumeSpecName: "kube-api-access-w45f2") pod "4fc7d360-e78a-4042-98e1-f8a2b97c10ab" (UID: "4fc7d360-e78a-4042-98e1-f8a2b97c10ab"). InnerVolumeSpecName "kube-api-access-w45f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.206860 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w45f2\" (UniqueName: \"kubernetes.io/projected/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-kube-api-access-w45f2\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.206911 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.206923 4807 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.206933 4807 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:44 
crc kubenswrapper[4807]: I1202 20:21:44.259923 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4fc7d360-e78a-4042-98e1-f8a2b97c10ab" (UID: "4fc7d360-e78a-4042-98e1-f8a2b97c10ab"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.301236 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6d77974966-h2jgz" event={"ID":"a08fc5b4-413a-42e3-85df-9fe9b8187746","Type":"ContainerDied","Data":"aa7decc0e999f64769c6f88655247cdf99b06a604ad869ab16fcdaaa5a44ebfd"} Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.301317 4807 scope.go:117] "RemoveContainer" containerID="6fe59c218356b690142b88b0acf56b55b69cbf11c5d0a7b16fa78f33877f57de" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.301612 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6d77974966-h2jgz" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.319538 4807 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.324321 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4f656871-7f26-4816-932c-326222105302","Type":"ContainerStarted","Data":"03db5fc661238310d682346c625f4354cc827d02aef2ce3a21b1a38da23fd367"} Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.339170 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" event={"ID":"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72","Type":"ContainerStarted","Data":"11d5f7249a4d1bbb33484216e73637e9898513f7314137f7d0e766e6e40fd147"} Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.339233 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" event={"ID":"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72","Type":"ContainerStarted","Data":"bb3050f5926637bacffca6fc1d6709a0b4e3e23413d30ec6fe61e1511060a19c"} Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.396992 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4fc7d360-e78a-4042-98e1-f8a2b97c10ab","Type":"ContainerDied","Data":"c1b82c1c5a726dbfa4a4c28f9e6106afb8f5e0df950d1b4d2b5145fec07a8a46"} Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.397118 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.402052 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c923e91c-9fb7-4246-9e86-44b80979855e","Type":"ContainerStarted","Data":"4dacd556e71f2f98c4b4c26b35e298b3ac902e8f8e35453e60b6723319c99558"} Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.442677 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fc7d360-e78a-4042-98e1-f8a2b97c10ab" (UID: "4fc7d360-e78a-4042-98e1-f8a2b97c10ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.489990 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-config-data" (OuterVolumeSpecName: "config-data") pod "4fc7d360-e78a-4042-98e1-f8a2b97c10ab" (UID: "4fc7d360-e78a-4042-98e1-f8a2b97c10ab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.524640 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.524700 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fc7d360-e78a-4042-98e1-f8a2b97c10ab-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.550225 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6d77974966-h2jgz"] Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.564632 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6d77974966-h2jgz"] Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.670372 4807 scope.go:117] "RemoveContainer" containerID="adaacc6b61133dcc824bf80520581fa4f62cfa038d2d36af7f1e8e1fc1d3634e" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.751517 4807 scope.go:117] "RemoveContainer" containerID="c57848eb36d5129057028cc80e4d504065ffd64db4872dd0e2f24de4a1972e86" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.768372 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.785571 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.795836 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:21:44 crc kubenswrapper[4807]: E1202 20:21:44.797159 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="ceilometer-central-agent" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.797189 4807 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="ceilometer-central-agent" Dec 02 20:21:44 crc kubenswrapper[4807]: E1202 20:21:44.797239 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerName="barbican-api-log" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.797249 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerName="barbican-api-log" Dec 02 20:21:44 crc kubenswrapper[4807]: E1202 20:21:44.797275 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="ceilometer-notification-agent" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.797286 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="ceilometer-notification-agent" Dec 02 20:21:44 crc kubenswrapper[4807]: E1202 20:21:44.797328 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerName="barbican-api" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.799843 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerName="barbican-api" Dec 02 20:21:44 crc kubenswrapper[4807]: E1202 20:21:44.800008 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="proxy-httpd" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.800036 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="proxy-httpd" Dec 02 20:21:44 crc kubenswrapper[4807]: E1202 20:21:44.800088 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="sg-core" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.800098 
4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="sg-core" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.800759 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="ceilometer-central-agent" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.800786 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerName="barbican-api-log" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.800810 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08fc5b4-413a-42e3-85df-9fe9b8187746" containerName="barbican-api" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.800830 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="proxy-httpd" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.800849 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="ceilometer-notification-agent" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.800864 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" containerName="sg-core" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.815116 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.815215 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6d4cd98589-bnbn5" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.815330 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.820260 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.822817 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.835676 4807 scope.go:117] "RemoveContainer" containerID="1a9e5779281cfa8d583cecb0461d40d778409de11d8e92227f9aa90367484c28" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.914160 4807 scope.go:117] "RemoveContainer" containerID="151a5b20e12ef473a2ac64a0c900d10c72e66ef7e2f8c1288e3e6b8a6e8bdf9f" Dec 02 20:21:44 crc kubenswrapper[4807]: I1202 20:21:44.976101 4807 scope.go:117] "RemoveContainer" containerID="68014bcc35f5cefa43ab4472ad58faa656b225d8d308e86a6303b1b2499aa487" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.002575 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.003082 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-config-data\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.003143 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-log-httpd\") pod \"ceilometer-0\" (UID: 
\"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.003448 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsgf8\" (UniqueName: \"kubernetes.io/projected/9703018b-a14a-43a6-b680-0a29c7e734fd-kube-api-access-xsgf8\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.003603 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-run-httpd\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.003697 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.003743 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-scripts\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.113301 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsgf8\" (UniqueName: \"kubernetes.io/projected/9703018b-a14a-43a6-b680-0a29c7e734fd-kube-api-access-xsgf8\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 
20:21:45.113932 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-run-httpd\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.114040 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-scripts\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.114074 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.114275 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.114583 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-config-data\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.114643 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-log-httpd\") pod \"ceilometer-0\" (UID: 
\"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.115243 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-run-httpd\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.116780 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-log-httpd\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.121848 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.122151 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.133669 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.139349 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.140510 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-scripts\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.142204 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-config-data\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.162876 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsgf8\" (UniqueName: \"kubernetes.io/projected/9703018b-a14a-43a6-b680-0a29c7e734fd-kube-api-access-xsgf8\") pod \"ceilometer-0\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.209402 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fc7d360-e78a-4042-98e1-f8a2b97c10ab" path="/var/lib/kubelet/pods/4fc7d360-e78a-4042-98e1-f8a2b97c10ab/volumes" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.210285 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a08fc5b4-413a-42e3-85df-9fe9b8187746" path="/var/lib/kubelet/pods/a08fc5b4-413a-42e3-85df-9fe9b8187746/volumes" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.442074 4807 generic.go:334] "Generic (PLEG): container finished" podID="61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" containerID="11d5f7249a4d1bbb33484216e73637e9898513f7314137f7d0e766e6e40fd147" exitCode=0 Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.442187 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" event={"ID":"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72","Type":"ContainerDied","Data":"11d5f7249a4d1bbb33484216e73637e9898513f7314137f7d0e766e6e40fd147"} Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 
20:21:45.442216 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" event={"ID":"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72","Type":"ContainerStarted","Data":"d16103684b5bbaddf734cc78a6c310b6bb8f9eb58d50f341e5a85c8f8040d15d"} Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.444459 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.445565 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.469814 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4f656871-7f26-4816-932c-326222105302","Type":"ContainerStarted","Data":"493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e"} Dec 02 20:21:45 crc kubenswrapper[4807]: I1202 20:21:45.482376 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" podStartSLOduration=4.482350098 podStartE2EDuration="4.482350098s" podCreationTimestamp="2025-12-02 20:21:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:45.471376154 +0000 UTC m=+1440.772283649" watchObservedRunningTime="2025-12-02 20:21:45.482350098 +0000 UTC m=+1440.783257593" Dec 02 20:21:46 crc kubenswrapper[4807]: I1202 20:21:46.178421 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 20:21:46 crc kubenswrapper[4807]: I1202 20:21:46.190705 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:21:46 crc kubenswrapper[4807]: I1202 20:21:46.564548 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"4f656871-7f26-4816-932c-326222105302","Type":"ContainerStarted","Data":"5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4"} Dec 02 20:21:46 crc kubenswrapper[4807]: I1202 20:21:46.565113 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4f656871-7f26-4816-932c-326222105302" containerName="cinder-api-log" containerID="cri-o://493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e" gracePeriod=30 Dec 02 20:21:46 crc kubenswrapper[4807]: I1202 20:21:46.565198 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4f656871-7f26-4816-932c-326222105302" containerName="cinder-api" containerID="cri-o://5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4" gracePeriod=30 Dec 02 20:21:46 crc kubenswrapper[4807]: I1202 20:21:46.565484 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 20:21:46 crc kubenswrapper[4807]: I1202 20:21:46.573885 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9703018b-a14a-43a6-b680-0a29c7e734fd","Type":"ContainerStarted","Data":"530fc271b22fea83a6dcf6b3a180ffceaae0d6d51fab0120064a8897c5b9824a"} Dec 02 20:21:46 crc kubenswrapper[4807]: I1202 20:21:46.581274 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c923e91c-9fb7-4246-9e86-44b80979855e","Type":"ContainerStarted","Data":"208d13485cde16a92ad340b86623b207277c84223e68018f32ff78c1e017584c"} Dec 02 20:21:46 crc kubenswrapper[4807]: I1202 20:21:46.616310 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.616279397 podStartE2EDuration="4.616279397s" podCreationTimestamp="2025-12-02 20:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-02 20:21:46.60216179 +0000 UTC m=+1441.903069285" watchObservedRunningTime="2025-12-02 20:21:46.616279397 +0000 UTC m=+1441.917186892" Dec 02 20:21:47 crc kubenswrapper[4807]: I1202 20:21:47.608090 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9703018b-a14a-43a6-b680-0a29c7e734fd","Type":"ContainerStarted","Data":"c9624798dede7c456625a9d09b4525f07b4df33ab18d8bc416cb442386992a8d"} Dec 02 20:21:47 crc kubenswrapper[4807]: I1202 20:21:47.613036 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c923e91c-9fb7-4246-9e86-44b80979855e","Type":"ContainerStarted","Data":"01e2c4637f6823cccb3acfcb3e34949a409fd95e636c46c42398c7849ab82587"} Dec 02 20:21:47 crc kubenswrapper[4807]: I1202 20:21:47.622655 4807 generic.go:334] "Generic (PLEG): container finished" podID="4f656871-7f26-4816-932c-326222105302" containerID="493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e" exitCode=143 Dec 02 20:21:47 crc kubenswrapper[4807]: I1202 20:21:47.624206 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4f656871-7f26-4816-932c-326222105302","Type":"ContainerDied","Data":"493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e"} Dec 02 20:21:47 crc kubenswrapper[4807]: I1202 20:21:47.650153 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.30545842 podStartE2EDuration="6.65012937s" podCreationTimestamp="2025-12-02 20:21:41 +0000 UTC" firstStartedPulling="2025-12-02 20:21:43.470010341 +0000 UTC m=+1438.770917836" lastFinishedPulling="2025-12-02 20:21:44.814681291 +0000 UTC m=+1440.115588786" observedRunningTime="2025-12-02 20:21:47.639148035 +0000 UTC m=+1442.940055540" watchObservedRunningTime="2025-12-02 20:21:47.65012937 +0000 UTC m=+1442.951036865" Dec 02 20:21:48 crc kubenswrapper[4807]: I1202 20:21:48.637278 
4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9703018b-a14a-43a6-b680-0a29c7e734fd","Type":"ContainerStarted","Data":"ab302bb0fb63c84c0270fba97fd710bdfd6b0e56b5a534f17d053da947f84866"} Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.168611 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.170116 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.172880 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.172891 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.173189 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-sp29m" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.179914 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.337083 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr8s8\" (UniqueName: \"kubernetes.io/projected/0f6c2f22-8527-4428-a503-7aedd5635e6b-kube-api-access-wr8s8\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.337472 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6c2f22-8527-4428-a503-7aedd5635e6b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " 
pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.337587 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f6c2f22-8527-4428-a503-7aedd5635e6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.337623 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f6c2f22-8527-4428-a503-7aedd5635e6b-openstack-config\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.440320 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f6c2f22-8527-4428-a503-7aedd5635e6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.440419 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f6c2f22-8527-4428-a503-7aedd5635e6b-openstack-config\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.440454 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr8s8\" (UniqueName: \"kubernetes.io/projected/0f6c2f22-8527-4428-a503-7aedd5635e6b-kube-api-access-wr8s8\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.440503 
4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6c2f22-8527-4428-a503-7aedd5635e6b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.441929 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0f6c2f22-8527-4428-a503-7aedd5635e6b-openstack-config\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.447543 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f6c2f22-8527-4428-a503-7aedd5635e6b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.449294 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0f6c2f22-8527-4428-a503-7aedd5635e6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.471364 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr8s8\" (UniqueName: \"kubernetes.io/projected/0f6c2f22-8527-4428-a503-7aedd5635e6b-kube-api-access-wr8s8\") pod \"openstackclient\" (UID: \"0f6c2f22-8527-4428-a503-7aedd5635e6b\") " pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.493475 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 20:21:49 crc kubenswrapper[4807]: I1202 20:21:49.657733 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9703018b-a14a-43a6-b680-0a29c7e734fd","Type":"ContainerStarted","Data":"45e64ab22e737f56f56b778637c988bbd807c4d6ee2d5839fc5c5c7fb5190739"} Dec 02 20:21:50 crc kubenswrapper[4807]: I1202 20:21:50.116200 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 20:21:50 crc kubenswrapper[4807]: I1202 20:21:50.677267 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9703018b-a14a-43a6-b680-0a29c7e734fd","Type":"ContainerStarted","Data":"342402e56084c324d14fbaabed363fa062033c990511ad4a68bcbaf840f6cfb5"} Dec 02 20:21:50 crc kubenswrapper[4807]: I1202 20:21:50.677804 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 20:21:50 crc kubenswrapper[4807]: I1202 20:21:50.685843 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0f6c2f22-8527-4428-a503-7aedd5635e6b","Type":"ContainerStarted","Data":"4b4b29db945c063aedf28c3895b5900f54a287907e07360bae59975921708dff"} Dec 02 20:21:50 crc kubenswrapper[4807]: I1202 20:21:50.716376 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.743294532 podStartE2EDuration="6.71634556s" podCreationTimestamp="2025-12-02 20:21:44 +0000 UTC" firstStartedPulling="2025-12-02 20:21:46.225388632 +0000 UTC m=+1441.526296127" lastFinishedPulling="2025-12-02 20:21:50.19843966 +0000 UTC m=+1445.499347155" observedRunningTime="2025-12-02 20:21:50.699404879 +0000 UTC m=+1446.000312384" watchObservedRunningTime="2025-12-02 20:21:50.71634556 +0000 UTC m=+1446.017253055" Dec 02 20:21:52 crc kubenswrapper[4807]: I1202 20:21:52.223978 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:21:52 crc kubenswrapper[4807]: I1202 20:21:52.257012 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 20:21:52 crc kubenswrapper[4807]: I1202 20:21:52.292013 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rxmrt"] Dec 02 20:21:52 crc kubenswrapper[4807]: I1202 20:21:52.292380 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" podUID="21597064-29fe-4ba7-8272-80d8fca543f5" containerName="dnsmasq-dns" containerID="cri-o://088bce9e2f37b63b77b6ac063f555963a5198efc4512305c375192867011c3b9" gracePeriod=10 Dec 02 20:21:52 crc kubenswrapper[4807]: I1202 20:21:52.604336 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 20:21:52 crc kubenswrapper[4807]: I1202 20:21:52.714742 4807 generic.go:334] "Generic (PLEG): container finished" podID="21597064-29fe-4ba7-8272-80d8fca543f5" containerID="088bce9e2f37b63b77b6ac063f555963a5198efc4512305c375192867011c3b9" exitCode=0 Dec 02 20:21:52 crc kubenswrapper[4807]: I1202 20:21:52.714795 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" event={"ID":"21597064-29fe-4ba7-8272-80d8fca543f5","Type":"ContainerDied","Data":"088bce9e2f37b63b77b6ac063f555963a5198efc4512305c375192867011c3b9"} Dec 02 20:21:52 crc kubenswrapper[4807]: I1202 20:21:52.764312 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.337232 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-9c96f4455-bvlsr"] Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.347583 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.350853 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.351513 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.351698 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.407123 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9c96f4455-bvlsr"] Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.458920 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f92xc\" (UniqueName: \"kubernetes.io/projected/60cf7565-bc2c-469d-a0ad-400e95d69528-kube-api-access-f92xc\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.459025 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60cf7565-bc2c-469d-a0ad-400e95d69528-run-httpd\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.459089 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-internal-tls-certs\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc 
kubenswrapper[4807]: I1202 20:21:53.459376 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60cf7565-bc2c-469d-a0ad-400e95d69528-log-httpd\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.459496 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60cf7565-bc2c-469d-a0ad-400e95d69528-etc-swift\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.459527 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-config-data\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.459600 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-combined-ca-bundle\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.459764 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-public-tls-certs\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 
20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.562730 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60cf7565-bc2c-469d-a0ad-400e95d69528-etc-swift\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.562811 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-config-data\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.562878 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-combined-ca-bundle\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.562955 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-public-tls-certs\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.563010 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f92xc\" (UniqueName: \"kubernetes.io/projected/60cf7565-bc2c-469d-a0ad-400e95d69528-kube-api-access-f92xc\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.563065 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60cf7565-bc2c-469d-a0ad-400e95d69528-run-httpd\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.563118 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-internal-tls-certs\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.563175 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60cf7565-bc2c-469d-a0ad-400e95d69528-log-httpd\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.563891 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60cf7565-bc2c-469d-a0ad-400e95d69528-log-httpd\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.565483 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60cf7565-bc2c-469d-a0ad-400e95d69528-run-httpd\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.572529 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-combined-ca-bundle\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.576440 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-public-tls-certs\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.594509 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f92xc\" (UniqueName: \"kubernetes.io/projected/60cf7565-bc2c-469d-a0ad-400e95d69528-kube-api-access-f92xc\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.595582 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-internal-tls-certs\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.598607 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60cf7565-bc2c-469d-a0ad-400e95d69528-config-data\") pod \"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.600085 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60cf7565-bc2c-469d-a0ad-400e95d69528-etc-swift\") pod 
\"swift-proxy-9c96f4455-bvlsr\" (UID: \"60cf7565-bc2c-469d-a0ad-400e95d69528\") " pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.693636 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.764343 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c923e91c-9fb7-4246-9e86-44b80979855e" containerName="cinder-scheduler" containerID="cri-o://208d13485cde16a92ad340b86623b207277c84223e68018f32ff78c1e017584c" gracePeriod=30 Dec 02 20:21:53 crc kubenswrapper[4807]: I1202 20:21:53.765170 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="c923e91c-9fb7-4246-9e86-44b80979855e" containerName="probe" containerID="cri-o://01e2c4637f6823cccb3acfcb3e34949a409fd95e636c46c42398c7849ab82587" gracePeriod=30 Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.033159 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.078825 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-sb\") pod \"21597064-29fe-4ba7-8272-80d8fca543f5\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.079311 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-config\") pod \"21597064-29fe-4ba7-8272-80d8fca543f5\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.079420 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-nb\") pod \"21597064-29fe-4ba7-8272-80d8fca543f5\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.079460 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9flw\" (UniqueName: \"kubernetes.io/projected/21597064-29fe-4ba7-8272-80d8fca543f5-kube-api-access-m9flw\") pod \"21597064-29fe-4ba7-8272-80d8fca543f5\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.079526 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-swift-storage-0\") pod \"21597064-29fe-4ba7-8272-80d8fca543f5\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.079567 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-svc\") pod \"21597064-29fe-4ba7-8272-80d8fca543f5\" (UID: \"21597064-29fe-4ba7-8272-80d8fca543f5\") " Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.120431 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21597064-29fe-4ba7-8272-80d8fca543f5-kube-api-access-m9flw" (OuterVolumeSpecName: "kube-api-access-m9flw") pod "21597064-29fe-4ba7-8272-80d8fca543f5" (UID: "21597064-29fe-4ba7-8272-80d8fca543f5"). InnerVolumeSpecName "kube-api-access-m9flw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.190519 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9flw\" (UniqueName: \"kubernetes.io/projected/21597064-29fe-4ba7-8272-80d8fca543f5-kube-api-access-m9flw\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.221003 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "21597064-29fe-4ba7-8272-80d8fca543f5" (UID: "21597064-29fe-4ba7-8272-80d8fca543f5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.233997 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-config" (OuterVolumeSpecName: "config") pod "21597064-29fe-4ba7-8272-80d8fca543f5" (UID: "21597064-29fe-4ba7-8272-80d8fca543f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.263369 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "21597064-29fe-4ba7-8272-80d8fca543f5" (UID: "21597064-29fe-4ba7-8272-80d8fca543f5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.311444 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.311492 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.311508 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.316557 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21597064-29fe-4ba7-8272-80d8fca543f5" (UID: "21597064-29fe-4ba7-8272-80d8fca543f5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.352519 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "21597064-29fe-4ba7-8272-80d8fca543f5" (UID: "21597064-29fe-4ba7-8272-80d8fca543f5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.413150 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.413196 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21597064-29fe-4ba7-8272-80d8fca543f5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.745260 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9c96f4455-bvlsr"] Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.800134 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.801037 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-rxmrt" event={"ID":"21597064-29fe-4ba7-8272-80d8fca543f5","Type":"ContainerDied","Data":"a7502c37baa9397207f505a88734bf00bfa15cfd33bac185f887113271b1e7e1"} Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.801127 4807 scope.go:117] "RemoveContainer" containerID="088bce9e2f37b63b77b6ac063f555963a5198efc4512305c375192867011c3b9" Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.825388 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9c96f4455-bvlsr" event={"ID":"60cf7565-bc2c-469d-a0ad-400e95d69528","Type":"ContainerStarted","Data":"6e3868f59481a5edf243eaed2201e42cf831f33a6e0088a5739d4d728ab793c1"} Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.959578 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rxmrt"] Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.970458 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-rxmrt"] Dec 02 20:21:54 crc kubenswrapper[4807]: I1202 20:21:54.995032 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21597064-29fe-4ba7-8272-80d8fca543f5" path="/var/lib/kubelet/pods/21597064-29fe-4ba7-8272-80d8fca543f5/volumes" Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.191462 4807 scope.go:117] "RemoveContainer" containerID="286cbc0941bd2fab8e733535d78d68500bd7cd331feab13973e3cc5c9a5f0311" Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.425664 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.426326 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" 
containerName="ceilometer-central-agent" containerID="cri-o://c9624798dede7c456625a9d09b4525f07b4df33ab18d8bc416cb442386992a8d" gracePeriod=30 Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.426348 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="sg-core" containerID="cri-o://45e64ab22e737f56f56b778637c988bbd807c4d6ee2d5839fc5c5c7fb5190739" gracePeriod=30 Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.426463 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="proxy-httpd" containerID="cri-o://342402e56084c324d14fbaabed363fa062033c990511ad4a68bcbaf840f6cfb5" gracePeriod=30 Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.426508 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="ceilometer-notification-agent" containerID="cri-o://ab302bb0fb63c84c0270fba97fd710bdfd6b0e56b5a534f17d053da947f84866" gracePeriod=30 Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.854810 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9c96f4455-bvlsr" event={"ID":"60cf7565-bc2c-469d-a0ad-400e95d69528","Type":"ContainerStarted","Data":"d37b64144ba4c247c9f911fd3342a46b30e0fd8772269a2da0c48a1cbe90b2b1"} Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.855086 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9c96f4455-bvlsr" event={"ID":"60cf7565-bc2c-469d-a0ad-400e95d69528","Type":"ContainerStarted","Data":"63de418b68506fc3f778eb0938bd2df73e82b0b1ee879056bc17c9904b97e2f6"} Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.855288 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 
20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.859863 4807 generic.go:334] "Generic (PLEG): container finished" podID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerID="342402e56084c324d14fbaabed363fa062033c990511ad4a68bcbaf840f6cfb5" exitCode=0 Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.859907 4807 generic.go:334] "Generic (PLEG): container finished" podID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerID="45e64ab22e737f56f56b778637c988bbd807c4d6ee2d5839fc5c5c7fb5190739" exitCode=2 Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.859910 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9703018b-a14a-43a6-b680-0a29c7e734fd","Type":"ContainerDied","Data":"342402e56084c324d14fbaabed363fa062033c990511ad4a68bcbaf840f6cfb5"} Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.859943 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9703018b-a14a-43a6-b680-0a29c7e734fd","Type":"ContainerDied","Data":"45e64ab22e737f56f56b778637c988bbd807c4d6ee2d5839fc5c5c7fb5190739"} Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.863901 4807 generic.go:334] "Generic (PLEG): container finished" podID="c923e91c-9fb7-4246-9e86-44b80979855e" containerID="01e2c4637f6823cccb3acfcb3e34949a409fd95e636c46c42398c7849ab82587" exitCode=0 Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.863921 4807 generic.go:334] "Generic (PLEG): container finished" podID="c923e91c-9fb7-4246-9e86-44b80979855e" containerID="208d13485cde16a92ad340b86623b207277c84223e68018f32ff78c1e017584c" exitCode=0 Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.863942 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c923e91c-9fb7-4246-9e86-44b80979855e","Type":"ContainerDied","Data":"01e2c4637f6823cccb3acfcb3e34949a409fd95e636c46c42398c7849ab82587"} Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.863957 4807 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c923e91c-9fb7-4246-9e86-44b80979855e","Type":"ContainerDied","Data":"208d13485cde16a92ad340b86623b207277c84223e68018f32ff78c1e017584c"} Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.863967 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"c923e91c-9fb7-4246-9e86-44b80979855e","Type":"ContainerDied","Data":"4dacd556e71f2f98c4b4c26b35e298b3ac902e8f8e35453e60b6723319c99558"} Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.863978 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dacd556e71f2f98c4b4c26b35e298b3ac902e8f8e35453e60b6723319c99558" Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.887055 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-9c96f4455-bvlsr" podStartSLOduration=2.887041371 podStartE2EDuration="2.887041371s" podCreationTimestamp="2025-12-02 20:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:55.882512487 +0000 UTC m=+1451.183420002" watchObservedRunningTime="2025-12-02 20:21:55.887041371 +0000 UTC m=+1451.187948866" Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.889530 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.946294 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-scripts\") pod \"c923e91c-9fb7-4246-9e86-44b80979855e\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.946364 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c923e91c-9fb7-4246-9e86-44b80979855e-etc-machine-id\") pod \"c923e91c-9fb7-4246-9e86-44b80979855e\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.946402 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-combined-ca-bundle\") pod \"c923e91c-9fb7-4246-9e86-44b80979855e\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.946450 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data-custom\") pod \"c923e91c-9fb7-4246-9e86-44b80979855e\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.946482 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlrvh\" (UniqueName: \"kubernetes.io/projected/c923e91c-9fb7-4246-9e86-44b80979855e-kube-api-access-tlrvh\") pod \"c923e91c-9fb7-4246-9e86-44b80979855e\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.946772 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data\") pod \"c923e91c-9fb7-4246-9e86-44b80979855e\" (UID: \"c923e91c-9fb7-4246-9e86-44b80979855e\") " Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.950143 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c923e91c-9fb7-4246-9e86-44b80979855e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c923e91c-9fb7-4246-9e86-44b80979855e" (UID: "c923e91c-9fb7-4246-9e86-44b80979855e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.960397 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c923e91c-9fb7-4246-9e86-44b80979855e" (UID: "c923e91c-9fb7-4246-9e86-44b80979855e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.963356 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c923e91c-9fb7-4246-9e86-44b80979855e-kube-api-access-tlrvh" (OuterVolumeSpecName: "kube-api-access-tlrvh") pod "c923e91c-9fb7-4246-9e86-44b80979855e" (UID: "c923e91c-9fb7-4246-9e86-44b80979855e"). InnerVolumeSpecName "kube-api-access-tlrvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:21:55 crc kubenswrapper[4807]: I1202 20:21:55.963506 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-scripts" (OuterVolumeSpecName: "scripts") pod "c923e91c-9fb7-4246-9e86-44b80979855e" (UID: "c923e91c-9fb7-4246-9e86-44b80979855e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.049895 4807 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.049930 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlrvh\" (UniqueName: \"kubernetes.io/projected/c923e91c-9fb7-4246-9e86-44b80979855e-kube-api-access-tlrvh\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.049944 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.049958 4807 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c923e91c-9fb7-4246-9e86-44b80979855e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.061896 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c923e91c-9fb7-4246-9e86-44b80979855e" (UID: "c923e91c-9fb7-4246-9e86-44b80979855e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.144204 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data" (OuterVolumeSpecName: "config-data") pod "c923e91c-9fb7-4246-9e86-44b80979855e" (UID: "c923e91c-9fb7-4246-9e86-44b80979855e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.156436 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.156471 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c923e91c-9fb7-4246-9e86-44b80979855e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.887113 4807 generic.go:334] "Generic (PLEG): container finished" podID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerID="ab302bb0fb63c84c0270fba97fd710bdfd6b0e56b5a534f17d053da947f84866" exitCode=0 Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.887467 4807 generic.go:334] "Generic (PLEG): container finished" podID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerID="c9624798dede7c456625a9d09b4525f07b4df33ab18d8bc416cb442386992a8d" exitCode=0 Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.887352 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9703018b-a14a-43a6-b680-0a29c7e734fd","Type":"ContainerDied","Data":"ab302bb0fb63c84c0270fba97fd710bdfd6b0e56b5a534f17d053da947f84866"} Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.888517 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9703018b-a14a-43a6-b680-0a29c7e734fd","Type":"ContainerDied","Data":"c9624798dede7c456625a9d09b4525f07b4df33ab18d8bc416cb442386992a8d"} Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.888532 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9703018b-a14a-43a6-b680-0a29c7e734fd","Type":"ContainerDied","Data":"530fc271b22fea83a6dcf6b3a180ffceaae0d6d51fab0120064a8897c5b9824a"} Dec 02 
20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.888541 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="530fc271b22fea83a6dcf6b3a180ffceaae0d6d51fab0120064a8897c5b9824a" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.888608 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.890219 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.940263 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.951453 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.962458 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.990243 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c923e91c-9fb7-4246-9e86-44b80979855e" path="/var/lib/kubelet/pods/c923e91c-9fb7-4246-9e86-44b80979855e/volumes" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.994508 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 20:21:56 crc kubenswrapper[4807]: E1202 20:21:56.995353 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21597064-29fe-4ba7-8272-80d8fca543f5" containerName="dnsmasq-dns" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.995454 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="21597064-29fe-4ba7-8272-80d8fca543f5" containerName="dnsmasq-dns" Dec 02 20:21:56 crc kubenswrapper[4807]: E1202 20:21:56.995529 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="ceilometer-notification-agent" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.995587 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="ceilometer-notification-agent" Dec 02 20:21:56 crc kubenswrapper[4807]: E1202 20:21:56.995657 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="sg-core" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.995730 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="sg-core" Dec 02 20:21:56 crc kubenswrapper[4807]: E1202 20:21:56.995795 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c923e91c-9fb7-4246-9e86-44b80979855e" containerName="cinder-scheduler" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.995857 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c923e91c-9fb7-4246-9e86-44b80979855e" containerName="cinder-scheduler" Dec 02 20:21:56 crc kubenswrapper[4807]: E1202 20:21:56.995917 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="proxy-httpd" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.995967 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="proxy-httpd" Dec 02 20:21:56 crc kubenswrapper[4807]: E1202 20:21:56.996026 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="ceilometer-central-agent" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.996085 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="ceilometer-central-agent" Dec 02 20:21:56 crc kubenswrapper[4807]: E1202 20:21:56.996179 4807 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c923e91c-9fb7-4246-9e86-44b80979855e" containerName="probe" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.996232 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c923e91c-9fb7-4246-9e86-44b80979855e" containerName="probe" Dec 02 20:21:56 crc kubenswrapper[4807]: E1202 20:21:56.996299 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21597064-29fe-4ba7-8272-80d8fca543f5" containerName="init" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.996363 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="21597064-29fe-4ba7-8272-80d8fca543f5" containerName="init" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.996694 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="ceilometer-notification-agent" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.996794 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="proxy-httpd" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.996864 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c923e91c-9fb7-4246-9e86-44b80979855e" containerName="cinder-scheduler" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.996922 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="ceilometer-central-agent" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.996977 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c923e91c-9fb7-4246-9e86-44b80979855e" containerName="probe" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.997031 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" containerName="sg-core" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.997088 4807 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="21597064-29fe-4ba7-8272-80d8fca543f5" containerName="dnsmasq-dns" Dec 02 20:21:56 crc kubenswrapper[4807]: I1202 20:21:56.998672 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.002134 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.014600 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.077427 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsgf8\" (UniqueName: \"kubernetes.io/projected/9703018b-a14a-43a6-b680-0a29c7e734fd-kube-api-access-xsgf8\") pod \"9703018b-a14a-43a6-b680-0a29c7e734fd\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.077677 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-log-httpd\") pod \"9703018b-a14a-43a6-b680-0a29c7e734fd\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.077737 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-combined-ca-bundle\") pod \"9703018b-a14a-43a6-b680-0a29c7e734fd\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.077764 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-run-httpd\") pod \"9703018b-a14a-43a6-b680-0a29c7e734fd\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " Dec 02 
20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.077782 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-scripts\") pod \"9703018b-a14a-43a6-b680-0a29c7e734fd\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.077823 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-config-data\") pod \"9703018b-a14a-43a6-b680-0a29c7e734fd\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.077861 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-sg-core-conf-yaml\") pod \"9703018b-a14a-43a6-b680-0a29c7e734fd\" (UID: \"9703018b-a14a-43a6-b680-0a29c7e734fd\") " Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.078086 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-config-data\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.078149 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-scripts\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.078178 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk4l4\" (UniqueName: 
\"kubernetes.io/projected/4a7b23d9-c399-44ec-995e-54726ae83774-kube-api-access-pk4l4\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.078218 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a7b23d9-c399-44ec-995e-54726ae83774-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.078281 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.078322 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.079916 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9703018b-a14a-43a6-b680-0a29c7e734fd" (UID: "9703018b-a14a-43a6-b680-0a29c7e734fd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.081343 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9703018b-a14a-43a6-b680-0a29c7e734fd" (UID: "9703018b-a14a-43a6-b680-0a29c7e734fd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.087434 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9703018b-a14a-43a6-b680-0a29c7e734fd-kube-api-access-xsgf8" (OuterVolumeSpecName: "kube-api-access-xsgf8") pod "9703018b-a14a-43a6-b680-0a29c7e734fd" (UID: "9703018b-a14a-43a6-b680-0a29c7e734fd"). InnerVolumeSpecName "kube-api-access-xsgf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.108607 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.118229 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-scripts" (OuterVolumeSpecName: "scripts") pod "9703018b-a14a-43a6-b680-0a29c7e734fd" (UID: "9703018b-a14a-43a6-b680-0a29c7e734fd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.179955 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-config-data\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.189002 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-scripts\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.189113 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk4l4\" (UniqueName: \"kubernetes.io/projected/4a7b23d9-c399-44ec-995e-54726ae83774-kube-api-access-pk4l4\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.189233 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a7b23d9-c399-44ec-995e-54726ae83774-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.189367 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.189477 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.189609 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsgf8\" (UniqueName: \"kubernetes.io/projected/9703018b-a14a-43a6-b680-0a29c7e734fd-kube-api-access-xsgf8\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.189625 4807 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.189634 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.189643 4807 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9703018b-a14a-43a6-b680-0a29c7e734fd-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.192161 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-config-data\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.192220 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a7b23d9-c399-44ec-995e-54726ae83774-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.194910 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.197237 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.209300 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a7b23d9-c399-44ec-995e-54726ae83774-scripts\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.211133 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk4l4\" (UniqueName: \"kubernetes.io/projected/4a7b23d9-c399-44ec-995e-54726ae83774-kube-api-access-pk4l4\") pod \"cinder-scheduler-0\" (UID: \"4a7b23d9-c399-44ec-995e-54726ae83774\") " pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.211900 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9703018b-a14a-43a6-b680-0a29c7e734fd" (UID: "9703018b-a14a-43a6-b680-0a29c7e734fd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.244838 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9703018b-a14a-43a6-b680-0a29c7e734fd" (UID: "9703018b-a14a-43a6-b680-0a29c7e734fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.278910 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-config-data" (OuterVolumeSpecName: "config-data") pod "9703018b-a14a-43a6-b680-0a29c7e734fd" (UID: "9703018b-a14a-43a6-b680-0a29c7e734fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.293276 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.293331 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.293340 4807 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9703018b-a14a-43a6-b680-0a29c7e734fd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.330404 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.898962 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.939417 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.953303 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:21:57 crc kubenswrapper[4807]: I1202 20:21:57.984365 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:57.996834 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.000110 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.007987 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.014066 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.014161 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.120891 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l7zd\" (UniqueName: \"kubernetes.io/projected/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-kube-api-access-9l7zd\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.120981 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.121076 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.121495 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-log-httpd\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.126944 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-config-data\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.127101 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-scripts\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.127202 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-run-httpd\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.229396 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-log-httpd\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.229501 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-config-data\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.229521 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-scripts\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.229541 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-run-httpd\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.229598 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l7zd\" (UniqueName: \"kubernetes.io/projected/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-kube-api-access-9l7zd\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.229619 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.229666 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.230511 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-log-httpd\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.230899 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-run-httpd\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.236323 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-config-data\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.237346 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " 
pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.237931 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.243188 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-scripts\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.252762 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l7zd\" (UniqueName: \"kubernetes.io/projected/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-kube-api-access-9l7zd\") pod \"ceilometer-0\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.453799 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.926330 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4a7b23d9-c399-44ec-995e-54726ae83774","Type":"ContainerStarted","Data":"0a8acec8efe03efe21541bca3c9527b0b98a70e344bbb8acbd4329545e6cb97d"} Dec 02 20:21:58 crc kubenswrapper[4807]: I1202 20:21:58.926921 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4a7b23d9-c399-44ec-995e-54726ae83774","Type":"ContainerStarted","Data":"75a15ed78498a6f712ac5b2ef1fd9735161c5d4d529ddc3575168617bc596340"} Dec 02 20:21:59 crc kubenswrapper[4807]: I1202 20:21:59.026184 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9703018b-a14a-43a6-b680-0a29c7e734fd" path="/var/lib/kubelet/pods/9703018b-a14a-43a6-b680-0a29c7e734fd/volumes" Dec 02 20:21:59 crc kubenswrapper[4807]: I1202 20:21:59.206139 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:21:59 crc kubenswrapper[4807]: I1202 20:21:59.947821 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4a7b23d9-c399-44ec-995e-54726ae83774","Type":"ContainerStarted","Data":"b3d6ad285766b692d5f116b58b409d712a285d70adcda0693299271aeaf2fdaf"} Dec 02 20:21:59 crc kubenswrapper[4807]: I1202 20:21:59.950847 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1","Type":"ContainerStarted","Data":"cdb71427553e264d7690c904cdcc3ad763dbf0d5b4f68de32bff8bd40e2fcbd7"} Dec 02 20:21:59 crc kubenswrapper[4807]: I1202 20:21:59.990099 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.990073201 podStartE2EDuration="3.990073201s" podCreationTimestamp="2025-12-02 20:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:21:59.967698999 +0000 UTC m=+1455.268606494" watchObservedRunningTime="2025-12-02 20:21:59.990073201 +0000 UTC m=+1455.290980696" Dec 02 20:22:00 crc kubenswrapper[4807]: I1202 20:22:00.964931 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1","Type":"ContainerStarted","Data":"09e4f83dac6491d97d3f260dec631bdf13655deccee4f8d59eb829f923b744ae"} Dec 02 20:22:01 crc kubenswrapper[4807]: I1202 20:22:01.460469 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:02 crc kubenswrapper[4807]: I1202 20:22:02.331540 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 20:22:03 crc kubenswrapper[4807]: I1202 20:22:03.001393 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fc8c44-b8pmd" event={"ID":"62551774-7dc7-4727-a79d-f92d4f82d560","Type":"ContainerDied","Data":"1a2bc25490b727e95a676dc6e738dd9535b3f9346206cb4c536bfdd80fd32d19"} Dec 02 20:22:03 crc kubenswrapper[4807]: I1202 20:22:03.001325 4807 generic.go:334] "Generic (PLEG): container finished" podID="62551774-7dc7-4727-a79d-f92d4f82d560" containerID="1a2bc25490b727e95a676dc6e738dd9535b3f9346206cb4c536bfdd80fd32d19" exitCode=137 Dec 02 20:22:03 crc kubenswrapper[4807]: I1202 20:22:03.705923 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:22:03 crc kubenswrapper[4807]: I1202 20:22:03.707509 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9c96f4455-bvlsr" Dec 02 20:22:07 crc kubenswrapper[4807]: I1202 20:22:07.636096 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 20:22:08 crc kubenswrapper[4807]: I1202 
20:22:08.065002 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1","Type":"ContainerStarted","Data":"5dfebc0bed7f33a08fa20076c3d85f44fe7d3e88742feff016e12ba029d718c6"} Dec 02 20:22:08 crc kubenswrapper[4807]: I1202 20:22:08.069376 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0f6c2f22-8527-4428-a503-7aedd5635e6b","Type":"ContainerStarted","Data":"874ad873ac034dbc291607948651226734228b13c62f4d67f0ee75676bfa2478"} Dec 02 20:22:08 crc kubenswrapper[4807]: I1202 20:22:08.081870 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fc8c44-b8pmd" event={"ID":"62551774-7dc7-4727-a79d-f92d4f82d560","Type":"ContainerStarted","Data":"6140a25963321ba78bfcc620d218555820f13265bded3d755ce761deefdb6ef7"} Dec 02 20:22:08 crc kubenswrapper[4807]: I1202 20:22:08.082140 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cc8fc8c44-b8pmd" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon-log" containerID="cri-o://acc5eee2fa3e2e060eea697bd28253234491412bf2a4bb3353d3abe041d69fa2" gracePeriod=30 Dec 02 20:22:08 crc kubenswrapper[4807]: I1202 20:22:08.082507 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cc8fc8c44-b8pmd" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon" containerID="cri-o://6140a25963321ba78bfcc620d218555820f13265bded3d755ce761deefdb6ef7" gracePeriod=30 Dec 02 20:22:08 crc kubenswrapper[4807]: I1202 20:22:08.100487 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.120258739 podStartE2EDuration="19.100461211s" podCreationTimestamp="2025-12-02 20:21:49 +0000 UTC" firstStartedPulling="2025-12-02 20:21:50.166396613 +0000 UTC m=+1445.467304108" lastFinishedPulling="2025-12-02 20:22:07.146599085 +0000 
UTC m=+1462.447506580" observedRunningTime="2025-12-02 20:22:08.09264383 +0000 UTC m=+1463.393551325" watchObservedRunningTime="2025-12-02 20:22:08.100461211 +0000 UTC m=+1463.401368706" Dec 02 20:22:09 crc kubenswrapper[4807]: I1202 20:22:09.099054 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1","Type":"ContainerStarted","Data":"47732380b94d7a27cfc907079dae39ca54e698ec0f075a743a6b883c19a2ea27"} Dec 02 20:22:10 crc kubenswrapper[4807]: I1202 20:22:10.122622 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1","Type":"ContainerStarted","Data":"f61970ea9d319d6dfdbde4f05220c8f50a611b1027f70751b2a41f910279e245"} Dec 02 20:22:10 crc kubenswrapper[4807]: I1202 20:22:10.122969 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="ceilometer-central-agent" containerID="cri-o://09e4f83dac6491d97d3f260dec631bdf13655deccee4f8d59eb829f923b744ae" gracePeriod=30 Dec 02 20:22:10 crc kubenswrapper[4807]: I1202 20:22:10.123271 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="proxy-httpd" containerID="cri-o://f61970ea9d319d6dfdbde4f05220c8f50a611b1027f70751b2a41f910279e245" gracePeriod=30 Dec 02 20:22:10 crc kubenswrapper[4807]: I1202 20:22:10.123658 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 20:22:10 crc kubenswrapper[4807]: I1202 20:22:10.123315 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="sg-core" containerID="cri-o://47732380b94d7a27cfc907079dae39ca54e698ec0f075a743a6b883c19a2ea27" gracePeriod=30 Dec 02 20:22:10 
crc kubenswrapper[4807]: I1202 20:22:10.123285 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="ceilometer-notification-agent" containerID="cri-o://5dfebc0bed7f33a08fa20076c3d85f44fe7d3e88742feff016e12ba029d718c6" gracePeriod=30 Dec 02 20:22:10 crc kubenswrapper[4807]: I1202 20:22:10.158852 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.055058292 podStartE2EDuration="13.158818099s" podCreationTimestamp="2025-12-02 20:21:57 +0000 UTC" firstStartedPulling="2025-12-02 20:21:59.262699359 +0000 UTC m=+1454.563606854" lastFinishedPulling="2025-12-02 20:22:09.366459166 +0000 UTC m=+1464.667366661" observedRunningTime="2025-12-02 20:22:10.148318278 +0000 UTC m=+1465.449225773" watchObservedRunningTime="2025-12-02 20:22:10.158818099 +0000 UTC m=+1465.459725624" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.137696 4807 generic.go:334] "Generic (PLEG): container finished" podID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerID="f61970ea9d319d6dfdbde4f05220c8f50a611b1027f70751b2a41f910279e245" exitCode=0 Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.138503 4807 generic.go:334] "Generic (PLEG): container finished" podID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerID="47732380b94d7a27cfc907079dae39ca54e698ec0f075a743a6b883c19a2ea27" exitCode=2 Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.138520 4807 generic.go:334] "Generic (PLEG): container finished" podID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerID="5dfebc0bed7f33a08fa20076c3d85f44fe7d3e88742feff016e12ba029d718c6" exitCode=0 Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.138532 4807 generic.go:334] "Generic (PLEG): container finished" podID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerID="09e4f83dac6491d97d3f260dec631bdf13655deccee4f8d59eb829f923b744ae" exitCode=0 Dec 02 20:22:11 crc 
kubenswrapper[4807]: I1202 20:22:11.137766 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1","Type":"ContainerDied","Data":"f61970ea9d319d6dfdbde4f05220c8f50a611b1027f70751b2a41f910279e245"} Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.138581 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1","Type":"ContainerDied","Data":"47732380b94d7a27cfc907079dae39ca54e698ec0f075a743a6b883c19a2ea27"} Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.138611 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1","Type":"ContainerDied","Data":"5dfebc0bed7f33a08fa20076c3d85f44fe7d3e88742feff016e12ba029d718c6"} Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.138625 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1","Type":"ContainerDied","Data":"09e4f83dac6491d97d3f260dec631bdf13655deccee4f8d59eb829f923b744ae"} Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.138635 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1","Type":"ContainerDied","Data":"cdb71427553e264d7690c904cdcc3ad763dbf0d5b4f68de32bff8bd40e2fcbd7"} Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.138646 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdb71427553e264d7690c904cdcc3ad763dbf0d5b4f68de32bff8bd40e2fcbd7" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.162271 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.304297 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-run-httpd\") pod \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.304443 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-log-httpd\") pod \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.304510 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-scripts\") pod \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.304599 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-sg-core-conf-yaml\") pod \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.304678 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-config-data\") pod \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.304759 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-combined-ca-bundle\") pod \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.304799 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l7zd\" (UniqueName: \"kubernetes.io/projected/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-kube-api-access-9l7zd\") pod \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\" (UID: \"bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1\") " Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.305922 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" (UID: "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.306073 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" (UID: "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.312090 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-kube-api-access-9l7zd" (OuterVolumeSpecName: "kube-api-access-9l7zd") pod "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" (UID: "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1"). InnerVolumeSpecName "kube-api-access-9l7zd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.312850 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-scripts" (OuterVolumeSpecName: "scripts") pod "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" (UID: "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.348634 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" (UID: "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.399249 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" (UID: "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.406868 4807 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.406901 4807 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.406913 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.406923 4807 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.406934 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.406946 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l7zd\" (UniqueName: \"kubernetes.io/projected/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-kube-api-access-9l7zd\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.439174 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-config-data" (OuterVolumeSpecName: "config-data") pod "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" (UID: "bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:11 crc kubenswrapper[4807]: I1202 20:22:11.508807 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.166562 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.213489 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.233958 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.248916 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:12 crc kubenswrapper[4807]: E1202 20:22:12.249706 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="proxy-httpd" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.249792 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="proxy-httpd" Dec 02 20:22:12 crc kubenswrapper[4807]: E1202 20:22:12.249875 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="ceilometer-notification-agent" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.249950 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="ceilometer-notification-agent" Dec 02 20:22:12 crc kubenswrapper[4807]: E1202 20:22:12.250011 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="sg-core" Dec 02 20:22:12 crc 
kubenswrapper[4807]: I1202 20:22:12.250061 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="sg-core" Dec 02 20:22:12 crc kubenswrapper[4807]: E1202 20:22:12.250115 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="ceilometer-central-agent" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.250168 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="ceilometer-central-agent" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.250428 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="ceilometer-central-agent" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.250492 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="sg-core" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.250567 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="proxy-httpd" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.250633 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" containerName="ceilometer-notification-agent" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.252658 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.255586 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.256449 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.263211 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.289384 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cc8fc8c44-b8pmd" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.428503 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.428944 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6mlt\" (UniqueName: \"kubernetes.io/projected/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-kube-api-access-j6mlt\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.428986 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-scripts\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.429010 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-log-httpd\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.429060 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.429080 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-config-data\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.429113 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-run-httpd\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.531963 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.532111 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6mlt\" (UniqueName: \"kubernetes.io/projected/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-kube-api-access-j6mlt\") pod \"ceilometer-0\" (UID: 
\"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.532156 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-scripts\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.532185 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-log-httpd\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.532227 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.532285 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-config-data\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.532322 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-run-httpd\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.532981 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-log-httpd\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.532985 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-run-httpd\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.540856 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.541439 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.543397 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-scripts\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.549247 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-config-data\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.553778 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j6mlt\" (UniqueName: \"kubernetes.io/projected/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-kube-api-access-j6mlt\") pod \"ceilometer-0\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.585695 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:12 crc kubenswrapper[4807]: I1202 20:22:12.987851 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1" path="/var/lib/kubelet/pods/bd02cdf8-0d1d-4625-9dde-ec0c89e56bc1/volumes" Dec 02 20:22:13 crc kubenswrapper[4807]: I1202 20:22:13.129417 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:13 crc kubenswrapper[4807]: I1202 20:22:13.177687 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea104124-19ad-4aaa-8ee5-cbf24111ed4f","Type":"ContainerStarted","Data":"fcc3fa2edd102be421e45b8e6bc5ed3a53ff7541e4113976033053cc9bead6f6"} Dec 02 20:22:14 crc kubenswrapper[4807]: I1202 20:22:14.506257 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:15 crc kubenswrapper[4807]: I1202 20:22:15.204275 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea104124-19ad-4aaa-8ee5-cbf24111ed4f","Type":"ContainerStarted","Data":"0355a01dd916432a0f9b633efa09d905adadcb29d67e95267b540614d88aa257"} Dec 02 20:22:15 crc kubenswrapper[4807]: I1202 20:22:15.204867 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea104124-19ad-4aaa-8ee5-cbf24111ed4f","Type":"ContainerStarted","Data":"e2e70bc569a7569c022be93e464015b05f77c409760e111d8464bc1a2513f1b1"} Dec 02 20:22:16 crc kubenswrapper[4807]: I1202 20:22:16.170799 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/watcher-decision-engine-0"] Dec 02 20:22:16 crc kubenswrapper[4807]: I1202 20:22:16.171568 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="95945b94-12eb-4077-b233-45c5c2b6b51d" containerName="watcher-decision-engine" containerID="cri-o://c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f" gracePeriod=30 Dec 02 20:22:16 crc kubenswrapper[4807]: I1202 20:22:16.224360 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea104124-19ad-4aaa-8ee5-cbf24111ed4f","Type":"ContainerStarted","Data":"0cbc4bad70b8310b16e320ddf5ec62e945616f5ced701471d648bb8da57dd93b"} Dec 02 20:22:17 crc kubenswrapper[4807]: E1202 20:22:17.121213 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 02 20:22:17 crc kubenswrapper[4807]: E1202 20:22:17.127725 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 02 20:22:17 crc kubenswrapper[4807]: E1202 20:22:17.129956 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Dec 02 20:22:17 crc kubenswrapper[4807]: E1202 20:22:17.130067 4807 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="95945b94-12eb-4077-b233-45c5c2b6b51d" containerName="watcher-decision-engine" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.236837 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.237867 4807 generic.go:334] "Generic (PLEG): container finished" podID="4f656871-7f26-4816-932c-326222105302" containerID="5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4" exitCode=137 Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.237945 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4f656871-7f26-4816-932c-326222105302","Type":"ContainerDied","Data":"5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4"} Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.237995 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4f656871-7f26-4816-932c-326222105302","Type":"ContainerDied","Data":"03db5fc661238310d682346c625f4354cc827d02aef2ce3a21b1a38da23fd367"} Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.238036 4807 scope.go:117] "RemoveContainer" containerID="5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.272776 4807 scope.go:117] "RemoveContainer" containerID="493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.308612 4807 scope.go:117] "RemoveContainer" containerID="5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4" Dec 02 20:22:17 crc kubenswrapper[4807]: E1202 20:22:17.311362 4807 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4\": container with ID starting with 5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4 not found: ID does not exist" containerID="5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.311426 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4"} err="failed to get container status \"5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4\": rpc error: code = NotFound desc = could not find container \"5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4\": container with ID starting with 5ced8fe7447409cf933a8be34d3ea19b317ca352ab7d7197b4e705dbadad94d4 not found: ID does not exist" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.311474 4807 scope.go:117] "RemoveContainer" containerID="493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e" Dec 02 20:22:17 crc kubenswrapper[4807]: E1202 20:22:17.312055 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e\": container with ID starting with 493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e not found: ID does not exist" containerID="493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.312077 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e"} err="failed to get container status \"493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e\": rpc error: code = NotFound desc = could not find container 
\"493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e\": container with ID starting with 493a3066702ab29611ce5ef62a2b39f60b8f0bd061915d34fc0391be7f5faa1e not found: ID does not exist" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.383708 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-combined-ca-bundle\") pod \"4f656871-7f26-4816-932c-326222105302\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.383843 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data\") pod \"4f656871-7f26-4816-932c-326222105302\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.383931 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drlxj\" (UniqueName: \"kubernetes.io/projected/4f656871-7f26-4816-932c-326222105302-kube-api-access-drlxj\") pod \"4f656871-7f26-4816-932c-326222105302\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.384011 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data-custom\") pod \"4f656871-7f26-4816-932c-326222105302\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.384037 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f656871-7f26-4816-932c-326222105302-etc-machine-id\") pod \"4f656871-7f26-4816-932c-326222105302\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " Dec 02 20:22:17 
crc kubenswrapper[4807]: I1202 20:22:17.384075 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f656871-7f26-4816-932c-326222105302-logs\") pod \"4f656871-7f26-4816-932c-326222105302\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.384098 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-scripts\") pod \"4f656871-7f26-4816-932c-326222105302\" (UID: \"4f656871-7f26-4816-932c-326222105302\") " Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.384781 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f656871-7f26-4816-932c-326222105302-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4f656871-7f26-4816-932c-326222105302" (UID: "4f656871-7f26-4816-932c-326222105302"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.386692 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f656871-7f26-4816-932c-326222105302-logs" (OuterVolumeSpecName: "logs") pod "4f656871-7f26-4816-932c-326222105302" (UID: "4f656871-7f26-4816-932c-326222105302"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.393910 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f656871-7f26-4816-932c-326222105302-kube-api-access-drlxj" (OuterVolumeSpecName: "kube-api-access-drlxj") pod "4f656871-7f26-4816-932c-326222105302" (UID: "4f656871-7f26-4816-932c-326222105302"). InnerVolumeSpecName "kube-api-access-drlxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.394148 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-scripts" (OuterVolumeSpecName: "scripts") pod "4f656871-7f26-4816-932c-326222105302" (UID: "4f656871-7f26-4816-932c-326222105302"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.394969 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4f656871-7f26-4816-932c-326222105302" (UID: "4f656871-7f26-4816-932c-326222105302"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.420321 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f656871-7f26-4816-932c-326222105302" (UID: "4f656871-7f26-4816-932c-326222105302"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.486953 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.486999 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drlxj\" (UniqueName: \"kubernetes.io/projected/4f656871-7f26-4816-932c-326222105302-kube-api-access-drlxj\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.487013 4807 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.487022 4807 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f656871-7f26-4816-932c-326222105302-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.487036 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f656871-7f26-4816-932c-326222105302-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.487048 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.499099 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data" (OuterVolumeSpecName: "config-data") pod "4f656871-7f26-4816-932c-326222105302" (UID: "4f656871-7f26-4816-932c-326222105302"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:17 crc kubenswrapper[4807]: I1202 20:22:17.589381 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f656871-7f26-4816-932c-326222105302-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.249477 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.252888 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea104124-19ad-4aaa-8ee5-cbf24111ed4f","Type":"ContainerStarted","Data":"6d3f5926fca33558c6e2b96aeb30eae016be1857ecb1c3c07825deed455bad7c"} Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.253101 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="ceilometer-central-agent" containerID="cri-o://e2e70bc569a7569c022be93e464015b05f77c409760e111d8464bc1a2513f1b1" gracePeriod=30 Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.253134 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="proxy-httpd" containerID="cri-o://6d3f5926fca33558c6e2b96aeb30eae016be1857ecb1c3c07825deed455bad7c" gracePeriod=30 Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.253154 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.253163 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="sg-core" 
containerID="cri-o://0cbc4bad70b8310b16e320ddf5ec62e945616f5ced701471d648bb8da57dd93b" gracePeriod=30 Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.253215 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="ceilometer-notification-agent" containerID="cri-o://0355a01dd916432a0f9b633efa09d905adadcb29d67e95267b540614d88aa257" gracePeriod=30 Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.284485 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.586103754 podStartE2EDuration="6.284454651s" podCreationTimestamp="2025-12-02 20:22:12 +0000 UTC" firstStartedPulling="2025-12-02 20:22:13.137218503 +0000 UTC m=+1468.438125998" lastFinishedPulling="2025-12-02 20:22:16.8355694 +0000 UTC m=+1472.136476895" observedRunningTime="2025-12-02 20:22:18.282162003 +0000 UTC m=+1473.583069488" watchObservedRunningTime="2025-12-02 20:22:18.284454651 +0000 UTC m=+1473.585362146" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.309565 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.323175 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.346251 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 20:22:18 crc kubenswrapper[4807]: E1202 20:22:18.346864 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f656871-7f26-4816-932c-326222105302" containerName="cinder-api-log" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.346883 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f656871-7f26-4816-932c-326222105302" containerName="cinder-api-log" Dec 02 20:22:18 crc kubenswrapper[4807]: E1202 20:22:18.346943 4807 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4f656871-7f26-4816-932c-326222105302" containerName="cinder-api" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.346949 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f656871-7f26-4816-932c-326222105302" containerName="cinder-api" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.347219 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f656871-7f26-4816-932c-326222105302" containerName="cinder-api" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.347235 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f656871-7f26-4816-932c-326222105302" containerName="cinder-api-log" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.348528 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.351152 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.352104 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.352465 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.357690 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.518121 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-scripts\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.518192 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.518225 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j6tj\" (UniqueName: \"kubernetes.io/projected/06947941-0c96-4330-b2f7-bbc193dcdf61-kube-api-access-4j6tj\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.518278 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06947941-0c96-4330-b2f7-bbc193dcdf61-logs\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.518306 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06947941-0c96-4330-b2f7-bbc193dcdf61-etc-machine-id\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.518328 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-config-data-custom\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.518389 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-public-tls-certs\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.518446 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-config-data\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.518489 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.621121 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.621191 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j6tj\" (UniqueName: \"kubernetes.io/projected/06947941-0c96-4330-b2f7-bbc193dcdf61-kube-api-access-4j6tj\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.621267 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06947941-0c96-4330-b2f7-bbc193dcdf61-logs\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" 
Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.621307 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06947941-0c96-4330-b2f7-bbc193dcdf61-etc-machine-id\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.621335 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-config-data-custom\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.621393 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-public-tls-certs\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.621449 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-config-data\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.621493 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.621536 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-scripts\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.624946 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06947941-0c96-4330-b2f7-bbc193dcdf61-etc-machine-id\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.625315 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06947941-0c96-4330-b2f7-bbc193dcdf61-logs\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.629597 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-config-data-custom\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.630242 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.631193 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-public-tls-certs\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.631343 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-config-data\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.631966 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.640675 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06947941-0c96-4330-b2f7-bbc193dcdf61-scripts\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.643301 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j6tj\" (UniqueName: \"kubernetes.io/projected/06947941-0c96-4330-b2f7-bbc193dcdf61-kube-api-access-4j6tj\") pod \"cinder-api-0\" (UID: \"06947941-0c96-4330-b2f7-bbc193dcdf61\") " pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.712609 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 20:22:18 crc kubenswrapper[4807]: I1202 20:22:18.990388 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f656871-7f26-4816-932c-326222105302" path="/var/lib/kubelet/pods/4f656871-7f26-4816-932c-326222105302/volumes" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.255177 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.273816 4807 generic.go:334] "Generic (PLEG): container finished" podID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerID="6d3f5926fca33558c6e2b96aeb30eae016be1857ecb1c3c07825deed455bad7c" exitCode=0 Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.274340 4807 generic.go:334] "Generic (PLEG): container finished" podID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerID="0cbc4bad70b8310b16e320ddf5ec62e945616f5ced701471d648bb8da57dd93b" exitCode=2 Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.274353 4807 generic.go:334] "Generic (PLEG): container finished" podID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerID="0355a01dd916432a0f9b633efa09d905adadcb29d67e95267b540614d88aa257" exitCode=0 Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.274363 4807 generic.go:334] "Generic (PLEG): container finished" podID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerID="e2e70bc569a7569c022be93e464015b05f77c409760e111d8464bc1a2513f1b1" exitCode=0 Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.273882 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea104124-19ad-4aaa-8ee5-cbf24111ed4f","Type":"ContainerDied","Data":"6d3f5926fca33558c6e2b96aeb30eae016be1857ecb1c3c07825deed455bad7c"} Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.274417 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ea104124-19ad-4aaa-8ee5-cbf24111ed4f","Type":"ContainerDied","Data":"0cbc4bad70b8310b16e320ddf5ec62e945616f5ced701471d648bb8da57dd93b"} Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.274439 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea104124-19ad-4aaa-8ee5-cbf24111ed4f","Type":"ContainerDied","Data":"0355a01dd916432a0f9b633efa09d905adadcb29d67e95267b540614d88aa257"} Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.274452 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea104124-19ad-4aaa-8ee5-cbf24111ed4f","Type":"ContainerDied","Data":"e2e70bc569a7569c022be93e464015b05f77c409760e111d8464bc1a2513f1b1"} Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.274463 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea104124-19ad-4aaa-8ee5-cbf24111ed4f","Type":"ContainerDied","Data":"fcc3fa2edd102be421e45b8e6bc5ed3a53ff7541e4113976033053cc9bead6f6"} Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.274481 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcc3fa2edd102be421e45b8e6bc5ed3a53ff7541e4113976033053cc9bead6f6" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.492368 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.548318 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-sg-core-conf-yaml\") pod \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.548404 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-config-data\") pod \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.548441 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-combined-ca-bundle\") pod \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.548511 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-run-httpd\") pod \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.548566 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-scripts\") pod \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.548588 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6mlt\" (UniqueName: 
\"kubernetes.io/projected/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-kube-api-access-j6mlt\") pod \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.548616 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-log-httpd\") pod \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\" (UID: \"ea104124-19ad-4aaa-8ee5-cbf24111ed4f\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.549683 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ea104124-19ad-4aaa-8ee5-cbf24111ed4f" (UID: "ea104124-19ad-4aaa-8ee5-cbf24111ed4f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.549922 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ea104124-19ad-4aaa-8ee5-cbf24111ed4f" (UID: "ea104124-19ad-4aaa-8ee5-cbf24111ed4f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.556880 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-scripts" (OuterVolumeSpecName: "scripts") pod "ea104124-19ad-4aaa-8ee5-cbf24111ed4f" (UID: "ea104124-19ad-4aaa-8ee5-cbf24111ed4f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.563924 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-kube-api-access-j6mlt" (OuterVolumeSpecName: "kube-api-access-j6mlt") pod "ea104124-19ad-4aaa-8ee5-cbf24111ed4f" (UID: "ea104124-19ad-4aaa-8ee5-cbf24111ed4f"). InnerVolumeSpecName "kube-api-access-j6mlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.656237 4807 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.656283 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.656297 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6mlt\" (UniqueName: \"kubernetes.io/projected/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-kube-api-access-j6mlt\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.656311 4807 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.678914 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ea104124-19ad-4aaa-8ee5-cbf24111ed4f" (UID: "ea104124-19ad-4aaa-8ee5-cbf24111ed4f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.731900 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea104124-19ad-4aaa-8ee5-cbf24111ed4f" (UID: "ea104124-19ad-4aaa-8ee5-cbf24111ed4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.759593 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.759840 4807 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.768847 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.772022 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-config-data" (OuterVolumeSpecName: "config-data") pod "ea104124-19ad-4aaa-8ee5-cbf24111ed4f" (UID: "ea104124-19ad-4aaa-8ee5-cbf24111ed4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.862609 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea104124-19ad-4aaa-8ee5-cbf24111ed4f-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.963704 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-config-data\") pod \"95945b94-12eb-4077-b233-45c5c2b6b51d\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.963872 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqqhl\" (UniqueName: \"kubernetes.io/projected/95945b94-12eb-4077-b233-45c5c2b6b51d-kube-api-access-jqqhl\") pod \"95945b94-12eb-4077-b233-45c5c2b6b51d\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.963926 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-combined-ca-bundle\") pod \"95945b94-12eb-4077-b233-45c5c2b6b51d\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.964014 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95945b94-12eb-4077-b233-45c5c2b6b51d-logs\") pod \"95945b94-12eb-4077-b233-45c5c2b6b51d\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.964054 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-custom-prometheus-ca\") pod 
\"95945b94-12eb-4077-b233-45c5c2b6b51d\" (UID: \"95945b94-12eb-4077-b233-45c5c2b6b51d\") " Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.964551 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95945b94-12eb-4077-b233-45c5c2b6b51d-logs" (OuterVolumeSpecName: "logs") pod "95945b94-12eb-4077-b233-45c5c2b6b51d" (UID: "95945b94-12eb-4077-b233-45c5c2b6b51d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:19 crc kubenswrapper[4807]: I1202 20:22:19.970635 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95945b94-12eb-4077-b233-45c5c2b6b51d-kube-api-access-jqqhl" (OuterVolumeSpecName: "kube-api-access-jqqhl") pod "95945b94-12eb-4077-b233-45c5c2b6b51d" (UID: "95945b94-12eb-4077-b233-45c5c2b6b51d"). InnerVolumeSpecName "kube-api-access-jqqhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.001839 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "95945b94-12eb-4077-b233-45c5c2b6b51d" (UID: "95945b94-12eb-4077-b233-45c5c2b6b51d"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.014921 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95945b94-12eb-4077-b233-45c5c2b6b51d" (UID: "95945b94-12eb-4077-b233-45c5c2b6b51d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.058304 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-config-data" (OuterVolumeSpecName: "config-data") pod "95945b94-12eb-4077-b233-45c5c2b6b51d" (UID: "95945b94-12eb-4077-b233-45c5c2b6b51d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.068866 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqqhl\" (UniqueName: \"kubernetes.io/projected/95945b94-12eb-4077-b233-45c5c2b6b51d-kube-api-access-jqqhl\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.068916 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.068925 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95945b94-12eb-4077-b233-45c5c2b6b51d-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.068936 4807 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.068944 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95945b94-12eb-4077-b233-45c5c2b6b51d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.303250 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"06947941-0c96-4330-b2f7-bbc193dcdf61","Type":"ContainerStarted","Data":"976cce0bdaa41adb846eaa92c2fc75ce5fe811aeb51b6818b57d4039b6babfe2"} Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.303315 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06947941-0c96-4330-b2f7-bbc193dcdf61","Type":"ContainerStarted","Data":"adde66aa3c5110423cc583f9307b61764233744f670ba1e9d2bb08515d2ce9b2"} Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.304977 4807 generic.go:334] "Generic (PLEG): container finished" podID="95945b94-12eb-4077-b233-45c5c2b6b51d" containerID="c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f" exitCode=0 Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.305096 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.307120 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.307244 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"95945b94-12eb-4077-b233-45c5c2b6b51d","Type":"ContainerDied","Data":"c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f"} Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.307321 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"95945b94-12eb-4077-b233-45c5c2b6b51d","Type":"ContainerDied","Data":"edc02ce2c1fe47105e66528d22ad724c1e9087d8c9124f05de80f5c3573abba4"} Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.307344 4807 scope.go:117] "RemoveContainer" containerID="c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.360524 4807 scope.go:117] "RemoveContainer" 
containerID="c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f" Dec 02 20:22:20 crc kubenswrapper[4807]: E1202 20:22:20.364905 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f\": container with ID starting with c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f not found: ID does not exist" containerID="c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.364959 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f"} err="failed to get container status \"c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f\": rpc error: code = NotFound desc = could not find container \"c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f\": container with ID starting with c87c2f0302a130cf41abe6822b2865ba1dc27741a63bce4c74e6b79bbbf2b18f not found: ID does not exist" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.386329 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.402212 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.415636 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 20:22:20 crc kubenswrapper[4807]: E1202 20:22:20.416533 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95945b94-12eb-4077-b233-45c5c2b6b51d" containerName="watcher-decision-engine" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.416559 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="95945b94-12eb-4077-b233-45c5c2b6b51d" 
containerName="watcher-decision-engine" Dec 02 20:22:20 crc kubenswrapper[4807]: E1202 20:22:20.416593 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="proxy-httpd" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.416599 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="proxy-httpd" Dec 02 20:22:20 crc kubenswrapper[4807]: E1202 20:22:20.416619 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="ceilometer-central-agent" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.416626 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="ceilometer-central-agent" Dec 02 20:22:20 crc kubenswrapper[4807]: E1202 20:22:20.416643 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="ceilometer-notification-agent" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.416650 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="ceilometer-notification-agent" Dec 02 20:22:20 crc kubenswrapper[4807]: E1202 20:22:20.416663 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="sg-core" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.416670 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="sg-core" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.416897 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="ceilometer-notification-agent" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.416913 4807 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="proxy-httpd" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.416922 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="ceilometer-central-agent" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.416936 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" containerName="sg-core" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.416949 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="95945b94-12eb-4077-b233-45c5c2b6b51d" containerName="watcher-decision-engine" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.418187 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.422022 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.429014 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.443683 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.459774 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.467821 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.472998 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.475984 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.477804 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.480243 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581128 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05674c10-e8c2-4ab8-9d80-185c9b814c9c-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581500 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-log-httpd\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581552 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/05674c10-e8c2-4ab8-9d80-185c9b814c9c-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581576 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-config-data\") pod 
\"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581620 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxkds\" (UniqueName: \"kubernetes.io/projected/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-kube-api-access-gxkds\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581642 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-scripts\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581663 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mpnq\" (UniqueName: \"kubernetes.io/projected/05674c10-e8c2-4ab8-9d80-185c9b814c9c-kube-api-access-2mpnq\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581731 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-run-httpd\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581757 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05674c10-e8c2-4ab8-9d80-185c9b814c9c-config-data\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " 
pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581787 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581806 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.581832 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05674c10-e8c2-4ab8-9d80-185c9b814c9c-logs\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.684574 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/05674c10-e8c2-4ab8-9d80-185c9b814c9c-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.684660 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-config-data\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.684755 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxkds\" (UniqueName: \"kubernetes.io/projected/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-kube-api-access-gxkds\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.684791 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-scripts\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.684827 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mpnq\" (UniqueName: \"kubernetes.io/projected/05674c10-e8c2-4ab8-9d80-185c9b814c9c-kube-api-access-2mpnq\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.684891 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-run-httpd\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.684919 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05674c10-e8c2-4ab8-9d80-185c9b814c9c-config-data\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.684956 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.684981 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.685003 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05674c10-e8c2-4ab8-9d80-185c9b814c9c-logs\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.685254 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05674c10-e8c2-4ab8-9d80-185c9b814c9c-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.689401 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-run-httpd\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.689502 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-log-httpd\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc 
kubenswrapper[4807]: I1202 20:22:20.692528 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05674c10-e8c2-4ab8-9d80-185c9b814c9c-logs\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.685346 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-log-httpd\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.699679 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05674c10-e8c2-4ab8-9d80-185c9b814c9c-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.702855 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/05674c10-e8c2-4ab8-9d80-185c9b814c9c-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.703568 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05674c10-e8c2-4ab8-9d80-185c9b814c9c-config-data\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.704017 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-scripts\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.708632 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.708936 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.710065 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-config-data\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.717480 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxkds\" (UniqueName: \"kubernetes.io/projected/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-kube-api-access-gxkds\") pod \"ceilometer-0\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " pod="openstack/ceilometer-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.718461 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mpnq\" (UniqueName: \"kubernetes.io/projected/05674c10-e8c2-4ab8-9d80-185c9b814c9c-kube-api-access-2mpnq\") pod \"watcher-decision-engine-0\" (UID: \"05674c10-e8c2-4ab8-9d80-185c9b814c9c\") " pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc 
kubenswrapper[4807]: I1202 20:22:20.747579 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 02 20:22:20 crc kubenswrapper[4807]: I1202 20:22:20.796416 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.008851 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95945b94-12eb-4077-b233-45c5c2b6b51d" path="/var/lib/kubelet/pods/95945b94-12eb-4077-b233-45c5c2b6b51d/volumes" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.009812 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea104124-19ad-4aaa-8ee5-cbf24111ed4f" path="/var/lib/kubelet/pods/ea104124-19ad-4aaa-8ee5-cbf24111ed4f/volumes" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.336102 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06947941-0c96-4330-b2f7-bbc193dcdf61","Type":"ContainerStarted","Data":"c62ef079e31d1986f2db7296f4a2f6e9963a77472fa8a02a1d0ab2a4a6fd1366"} Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.336500 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.377787 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.416881 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-fch86"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.418412 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fch86" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.461794 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fch86"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.462275 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.462252719 podStartE2EDuration="3.462252719s" podCreationTimestamp="2025-12-02 20:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:22:21.3959628 +0000 UTC m=+1476.696870295" watchObservedRunningTime="2025-12-02 20:22:21.462252719 +0000 UTC m=+1476.763160214" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.524793 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-s4gh8"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.526186 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s4gh8" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.553699 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-s4gh8"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.611846 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-fe50-account-create-update-rpfxl"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.615273 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fe50-account-create-update-rpfxl" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.618797 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm7gg\" (UniqueName: \"kubernetes.io/projected/54376bb6-4987-4ef6-8814-51fa5f95e7bb-kube-api-access-hm7gg\") pod \"nova-api-db-create-fch86\" (UID: \"54376bb6-4987-4ef6-8814-51fa5f95e7bb\") " pod="openstack/nova-api-db-create-fch86" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.618852 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c1e20d0-372b-45be-9e7a-eed8a5264b08-operator-scripts\") pod \"nova-cell0-db-create-s4gh8\" (UID: \"7c1e20d0-372b-45be-9e7a-eed8a5264b08\") " pod="openstack/nova-cell0-db-create-s4gh8" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.618875 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69xjd\" (UniqueName: \"kubernetes.io/projected/7c1e20d0-372b-45be-9e7a-eed8a5264b08-kube-api-access-69xjd\") pod \"nova-cell0-db-create-s4gh8\" (UID: \"7c1e20d0-372b-45be-9e7a-eed8a5264b08\") " pod="openstack/nova-cell0-db-create-s4gh8" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.618910 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54376bb6-4987-4ef6-8814-51fa5f95e7bb-operator-scripts\") pod \"nova-api-db-create-fch86\" (UID: \"54376bb6-4987-4ef6-8814-51fa5f95e7bb\") " pod="openstack/nova-api-db-create-fch86" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.638127 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.681794 4807 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-api-fe50-account-create-update-rpfxl"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.694806 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-ljt26"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.696344 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ljt26" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.709555 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ljt26"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.716929 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.724232 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54376bb6-4987-4ef6-8814-51fa5f95e7bb-operator-scripts\") pod \"nova-api-db-create-fch86\" (UID: \"54376bb6-4987-4ef6-8814-51fa5f95e7bb\") " pod="openstack/nova-api-db-create-fch86" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.724969 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0864d50-612d-47a3-bc0f-59883303dedf-operator-scripts\") pod \"nova-api-fe50-account-create-update-rpfxl\" (UID: \"d0864d50-612d-47a3-bc0f-59883303dedf\") " pod="openstack/nova-api-fe50-account-create-update-rpfxl" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.725119 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbqpc\" (UniqueName: \"kubernetes.io/projected/d0864d50-612d-47a3-bc0f-59883303dedf-kube-api-access-mbqpc\") pod \"nova-api-fe50-account-create-update-rpfxl\" (UID: \"d0864d50-612d-47a3-bc0f-59883303dedf\") " pod="openstack/nova-api-fe50-account-create-update-rpfxl" Dec 
02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.725261 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm7gg\" (UniqueName: \"kubernetes.io/projected/54376bb6-4987-4ef6-8814-51fa5f95e7bb-kube-api-access-hm7gg\") pod \"nova-api-db-create-fch86\" (UID: \"54376bb6-4987-4ef6-8814-51fa5f95e7bb\") " pod="openstack/nova-api-db-create-fch86" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.725393 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c1e20d0-372b-45be-9e7a-eed8a5264b08-operator-scripts\") pod \"nova-cell0-db-create-s4gh8\" (UID: \"7c1e20d0-372b-45be-9e7a-eed8a5264b08\") " pod="openstack/nova-cell0-db-create-s4gh8" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.725511 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69xjd\" (UniqueName: \"kubernetes.io/projected/7c1e20d0-372b-45be-9e7a-eed8a5264b08-kube-api-access-69xjd\") pod \"nova-cell0-db-create-s4gh8\" (UID: \"7c1e20d0-372b-45be-9e7a-eed8a5264b08\") " pod="openstack/nova-cell0-db-create-s4gh8" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.727186 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54376bb6-4987-4ef6-8814-51fa5f95e7bb-operator-scripts\") pod \"nova-api-db-create-fch86\" (UID: \"54376bb6-4987-4ef6-8814-51fa5f95e7bb\") " pod="openstack/nova-api-db-create-fch86" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.734535 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c1e20d0-372b-45be-9e7a-eed8a5264b08-operator-scripts\") pod \"nova-cell0-db-create-s4gh8\" (UID: \"7c1e20d0-372b-45be-9e7a-eed8a5264b08\") " pod="openstack/nova-cell0-db-create-s4gh8" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.754748 
4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69xjd\" (UniqueName: \"kubernetes.io/projected/7c1e20d0-372b-45be-9e7a-eed8a5264b08-kube-api-access-69xjd\") pod \"nova-cell0-db-create-s4gh8\" (UID: \"7c1e20d0-372b-45be-9e7a-eed8a5264b08\") " pod="openstack/nova-cell0-db-create-s4gh8" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.759577 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm7gg\" (UniqueName: \"kubernetes.io/projected/54376bb6-4987-4ef6-8814-51fa5f95e7bb-kube-api-access-hm7gg\") pod \"nova-api-db-create-fch86\" (UID: \"54376bb6-4987-4ef6-8814-51fa5f95e7bb\") " pod="openstack/nova-api-db-create-fch86" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.813925 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4f38-account-create-update-wbhvj"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.827235 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0864d50-612d-47a3-bc0f-59883303dedf-operator-scripts\") pod \"nova-api-fe50-account-create-update-rpfxl\" (UID: \"d0864d50-612d-47a3-bc0f-59883303dedf\") " pod="openstack/nova-api-fe50-account-create-update-rpfxl" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.827313 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbqpc\" (UniqueName: \"kubernetes.io/projected/d0864d50-612d-47a3-bc0f-59883303dedf-kube-api-access-mbqpc\") pod \"nova-api-fe50-account-create-update-rpfxl\" (UID: \"d0864d50-612d-47a3-bc0f-59883303dedf\") " pod="openstack/nova-api-fe50-account-create-update-rpfxl" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.827488 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-operator-scripts\") pod \"nova-cell1-db-create-ljt26\" (UID: \"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea\") " pod="openstack/nova-cell1-db-create-ljt26" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.827571 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfw9x\" (UniqueName: \"kubernetes.io/projected/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-kube-api-access-qfw9x\") pod \"nova-cell1-db-create-ljt26\" (UID: \"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea\") " pod="openstack/nova-cell1-db-create-ljt26" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.828490 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0864d50-612d-47a3-bc0f-59883303dedf-operator-scripts\") pod \"nova-api-fe50-account-create-update-rpfxl\" (UID: \"d0864d50-612d-47a3-bc0f-59883303dedf\") " pod="openstack/nova-api-fe50-account-create-update-rpfxl" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.831445 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.836229 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.848606 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4f38-account-create-update-wbhvj"] Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.868684 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fch86" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.893545 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-s4gh8" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.908430 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbqpc\" (UniqueName: \"kubernetes.io/projected/d0864d50-612d-47a3-bc0f-59883303dedf-kube-api-access-mbqpc\") pod \"nova-api-fe50-account-create-update-rpfxl\" (UID: \"d0864d50-612d-47a3-bc0f-59883303dedf\") " pod="openstack/nova-api-fe50-account-create-update-rpfxl" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.931041 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-operator-scripts\") pod \"nova-cell1-db-create-ljt26\" (UID: \"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea\") " pod="openstack/nova-cell1-db-create-ljt26" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.931122 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fl7h\" (UniqueName: \"kubernetes.io/projected/04ab6b01-20b7-4320-aa91-2deecbdac66a-kube-api-access-5fl7h\") pod \"nova-cell0-4f38-account-create-update-wbhvj\" (UID: \"04ab6b01-20b7-4320-aa91-2deecbdac66a\") " pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.931177 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfw9x\" (UniqueName: \"kubernetes.io/projected/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-kube-api-access-qfw9x\") pod \"nova-cell1-db-create-ljt26\" (UID: \"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea\") " pod="openstack/nova-cell1-db-create-ljt26" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.931215 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/04ab6b01-20b7-4320-aa91-2deecbdac66a-operator-scripts\") pod \"nova-cell0-4f38-account-create-update-wbhvj\" (UID: \"04ab6b01-20b7-4320-aa91-2deecbdac66a\") " pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.932030 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-operator-scripts\") pod \"nova-cell1-db-create-ljt26\" (UID: \"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea\") " pod="openstack/nova-cell1-db-create-ljt26" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.972477 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe50-account-create-update-rpfxl" Dec 02 20:22:21 crc kubenswrapper[4807]: I1202 20:22:21.973467 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfw9x\" (UniqueName: \"kubernetes.io/projected/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-kube-api-access-qfw9x\") pod \"nova-cell1-db-create-ljt26\" (UID: \"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea\") " pod="openstack/nova-cell1-db-create-ljt26" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.032982 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ab6b01-20b7-4320-aa91-2deecbdac66a-operator-scripts\") pod \"nova-cell0-4f38-account-create-update-wbhvj\" (UID: \"04ab6b01-20b7-4320-aa91-2deecbdac66a\") " pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.033136 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fl7h\" (UniqueName: \"kubernetes.io/projected/04ab6b01-20b7-4320-aa91-2deecbdac66a-kube-api-access-5fl7h\") pod \"nova-cell0-4f38-account-create-update-wbhvj\" (UID: 
\"04ab6b01-20b7-4320-aa91-2deecbdac66a\") " pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.033954 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ab6b01-20b7-4320-aa91-2deecbdac66a-operator-scripts\") pod \"nova-cell0-4f38-account-create-update-wbhvj\" (UID: \"04ab6b01-20b7-4320-aa91-2deecbdac66a\") " pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.069560 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fl7h\" (UniqueName: \"kubernetes.io/projected/04ab6b01-20b7-4320-aa91-2deecbdac66a-kube-api-access-5fl7h\") pod \"nova-cell0-4f38-account-create-update-wbhvj\" (UID: \"04ab6b01-20b7-4320-aa91-2deecbdac66a\") " pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.070892 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ljt26" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.094955 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-fb4f-account-create-update-rnlg5"] Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.099459 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.103783 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.108812 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.118260 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fb4f-account-create-update-rnlg5"] Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.239209 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-operator-scripts\") pod \"nova-cell1-fb4f-account-create-update-rnlg5\" (UID: \"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb\") " pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.239492 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq7b4\" (UniqueName: \"kubernetes.io/projected/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-kube-api-access-dq7b4\") pod \"nova-cell1-fb4f-account-create-update-rnlg5\" (UID: \"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb\") " pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.341992 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq7b4\" (UniqueName: \"kubernetes.io/projected/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-kube-api-access-dq7b4\") pod \"nova-cell1-fb4f-account-create-update-rnlg5\" (UID: \"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb\") " pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.342632 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-operator-scripts\") pod \"nova-cell1-fb4f-account-create-update-rnlg5\" (UID: \"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb\") " 
pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.343751 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-operator-scripts\") pod \"nova-cell1-fb4f-account-create-update-rnlg5\" (UID: \"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb\") " pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.387122 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"05674c10-e8c2-4ab8-9d80-185c9b814c9c","Type":"ContainerStarted","Data":"647c5c22cb3f26ac926725bb500906d235dc56b28a2e0d29b00e05b5b0b70d41"} Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.387880 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq7b4\" (UniqueName: \"kubernetes.io/projected/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-kube-api-access-dq7b4\") pod \"nova-cell1-fb4f-account-create-update-rnlg5\" (UID: \"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb\") " pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.398154 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66","Type":"ContainerStarted","Data":"452ec1b8016f9165eee82e3b2508137100dba9ed8833f67dcccb8beefc41a7d9"} Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.426059 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.671491 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-s4gh8"] Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.832838 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fch86"] Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.939403 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fe50-account-create-update-rpfxl"] Dec 02 20:22:22 crc kubenswrapper[4807]: I1202 20:22:22.962472 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4f38-account-create-update-wbhvj"] Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.001297 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ljt26"] Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.014692 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-fb4f-account-create-update-rnlg5"] Dec 02 20:22:23 crc kubenswrapper[4807]: W1202 20:22:23.041179 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac62b3e0_8a0f_4953_b543_4f699b3bb7cb.slice/crio-18cd746fc5ad25e89ed524db0457ec7909df49dfeb7179a0a594ba9cae188ee5 WatchSource:0}: Error finding container 18cd746fc5ad25e89ed524db0457ec7909df49dfeb7179a0a594ba9cae188ee5: Status 404 returned error can't find the container with id 18cd746fc5ad25e89ed524db0457ec7909df49dfeb7179a0a594ba9cae188ee5 Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.272358 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.414345 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ljt26" 
event={"ID":"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea","Type":"ContainerStarted","Data":"ce664c0c8e6898acac67be7b9bb39623e803dbe937dd6be6364930c9f32308e5"} Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.418173 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fch86" event={"ID":"54376bb6-4987-4ef6-8814-51fa5f95e7bb","Type":"ContainerStarted","Data":"23e4a50af1603e639b20dd72232f60c8328d4fa771e6df0a51aab9420e067a4f"} Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.426232 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s4gh8" event={"ID":"7c1e20d0-372b-45be-9e7a-eed8a5264b08","Type":"ContainerStarted","Data":"25e93ffe61c6b7ad31ce6232b73d2ae02283586d0d18b862cabbf5184f6d06ea"} Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.426331 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s4gh8" event={"ID":"7c1e20d0-372b-45be-9e7a-eed8a5264b08","Type":"ContainerStarted","Data":"10ef11c1e2b5baac0b840b88857cc680a434b7314192fe690e869a6802e937ab"} Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.433435 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" event={"ID":"04ab6b01-20b7-4320-aa91-2deecbdac66a","Type":"ContainerStarted","Data":"12ea2e61da72edd74bf5dd3b60f7e231f82b65975b86c076d53b68e9f98d7001"} Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.437256 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe50-account-create-update-rpfxl" event={"ID":"d0864d50-612d-47a3-bc0f-59883303dedf","Type":"ContainerStarted","Data":"7050e2d60ae8d9f1628f5e61a297e30e4855ff0ac3a80a08a62d4f6491853e63"} Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.439540 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" 
event={"ID":"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb","Type":"ContainerStarted","Data":"18cd746fc5ad25e89ed524db0457ec7909df49dfeb7179a0a594ba9cae188ee5"} Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.445776 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"05674c10-e8c2-4ab8-9d80-185c9b814c9c","Type":"ContainerStarted","Data":"718444795e7499ba25440d566ef565c9d75f73055f8ed5ab8fbd419845d3855e"} Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.462649 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-s4gh8" podStartSLOduration=2.462614532 podStartE2EDuration="2.462614532s" podCreationTimestamp="2025-12-02 20:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:22:23.446067493 +0000 UTC m=+1478.746974988" watchObservedRunningTime="2025-12-02 20:22:23.462614532 +0000 UTC m=+1478.763522027" Dec 02 20:22:23 crc kubenswrapper[4807]: I1202 20:22:23.470615 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=3.470592067 podStartE2EDuration="3.470592067s" podCreationTimestamp="2025-12-02 20:22:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:22:23.469861396 +0000 UTC m=+1478.770768891" watchObservedRunningTime="2025-12-02 20:22:23.470592067 +0000 UTC m=+1478.771499582" Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.462166 4807 generic.go:334] "Generic (PLEG): container finished" podID="04ab6b01-20b7-4320-aa91-2deecbdac66a" containerID="62ee5d0af02152db741266410f81a2f85b6e7440da183fb4473f51ef1e9b4011" exitCode=0 Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.462404 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" event={"ID":"04ab6b01-20b7-4320-aa91-2deecbdac66a","Type":"ContainerDied","Data":"62ee5d0af02152db741266410f81a2f85b6e7440da183fb4473f51ef1e9b4011"} Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.473579 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66","Type":"ContainerStarted","Data":"a91b7221a9ef17496b2831a07e93cbc306113c6716495c912d3c40bdeb3437f5"} Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.487050 4807 generic.go:334] "Generic (PLEG): container finished" podID="d0864d50-612d-47a3-bc0f-59883303dedf" containerID="313f2233436bcc17c17ac5ac71d7bd4a4ca98714cfeecb3c919ac2d292748484" exitCode=0 Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.487216 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe50-account-create-update-rpfxl" event={"ID":"d0864d50-612d-47a3-bc0f-59883303dedf","Type":"ContainerDied","Data":"313f2233436bcc17c17ac5ac71d7bd4a4ca98714cfeecb3c919ac2d292748484"} Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.499945 4807 generic.go:334] "Generic (PLEG): container finished" podID="ac62b3e0-8a0f-4953-b543-4f699b3bb7cb" containerID="d7fb0c43f3c59949a33ece3d7f74b9a550ef712b92367d6c8a54c44d706b554a" exitCode=0 Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.500043 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" event={"ID":"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb","Type":"ContainerDied","Data":"d7fb0c43f3c59949a33ece3d7f74b9a550ef712b92367d6c8a54c44d706b554a"} Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.505761 4807 generic.go:334] "Generic (PLEG): container finished" podID="7bde51b6-13ef-4b4b-b126-b3effc7dc8ea" containerID="0452659f40781e97b757d36f04cd0ff6da79adcabe9b19966bb7e19a067f516e" exitCode=0 Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.505850 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ljt26" event={"ID":"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea","Type":"ContainerDied","Data":"0452659f40781e97b757d36f04cd0ff6da79adcabe9b19966bb7e19a067f516e"} Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.512998 4807 generic.go:334] "Generic (PLEG): container finished" podID="54376bb6-4987-4ef6-8814-51fa5f95e7bb" containerID="5e5173c578ac80bca14a7eeb0505f508f98202bb92c7a97be190a8074c7ab083" exitCode=0 Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.513096 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fch86" event={"ID":"54376bb6-4987-4ef6-8814-51fa5f95e7bb","Type":"ContainerDied","Data":"5e5173c578ac80bca14a7eeb0505f508f98202bb92c7a97be190a8074c7ab083"} Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.521325 4807 generic.go:334] "Generic (PLEG): container finished" podID="7c1e20d0-372b-45be-9e7a-eed8a5264b08" containerID="25e93ffe61c6b7ad31ce6232b73d2ae02283586d0d18b862cabbf5184f6d06ea" exitCode=0 Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.522248 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s4gh8" event={"ID":"7c1e20d0-372b-45be-9e7a-eed8a5264b08","Type":"ContainerDied","Data":"25e93ffe61c6b7ad31ce6232b73d2ae02283586d0d18b862cabbf5184f6d06ea"} Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.697410 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.698119 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerName="glance-log" containerID="cri-o://5da3672237909f2b76e84812636ed8d7d8ec3f5487f44ca0397dbb4ca09ef4da" gracePeriod=30 Dec 02 20:22:24 crc kubenswrapper[4807]: I1202 20:22:24.702777 4807 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerName="glance-httpd" containerID="cri-o://5923f4a532f99d9911076926bb80bbf7974024d3d16e2323a9861629b0725eac" gracePeriod=30 Dec 02 20:22:25 crc kubenswrapper[4807]: I1202 20:22:25.535991 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66","Type":"ContainerStarted","Data":"f0fc967027c35c9af4d717f0647556452e3f07dadb2522d9edd32e6232dd7505"} Dec 02 20:22:25 crc kubenswrapper[4807]: I1202 20:22:25.536444 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66","Type":"ContainerStarted","Data":"88f09b99835ddf7ee29bc4a6b2c519c183c21b3c370aca7e13b229ebf50952a1"} Dec 02 20:22:25 crc kubenswrapper[4807]: I1202 20:22:25.540186 4807 generic.go:334] "Generic (PLEG): container finished" podID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerID="5da3672237909f2b76e84812636ed8d7d8ec3f5487f44ca0397dbb4ca09ef4da" exitCode=143 Dec 02 20:22:25 crc kubenswrapper[4807]: I1202 20:22:25.540253 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"88b698ce-a6a9-4607-8629-394e3a2d7d29","Type":"ContainerDied","Data":"5da3672237909f2b76e84812636ed8d7d8ec3f5487f44ca0397dbb4ca09ef4da"} Dec 02 20:22:25 crc kubenswrapper[4807]: I1202 20:22:25.566105 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:22:25 crc kubenswrapper[4807]: I1202 20:22:25.566502 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerName="glance-log" containerID="cri-o://af444d5027b97215d0398e19787af2420a5406dd8f3f33a00bd64ff2801c5918" gracePeriod=30 Dec 02 20:22:25 crc kubenswrapper[4807]: 
I1202 20:22:25.566627 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerName="glance-httpd" containerID="cri-o://a1234d2f9bbea41e51f17c78fe72a9d053c485e8d7dd5e87d80b7a256d014711" gracePeriod=30 Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.117969 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ljt26" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.174928 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-operator-scripts\") pod \"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea\" (UID: \"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.175118 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfw9x\" (UniqueName: \"kubernetes.io/projected/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-kube-api-access-qfw9x\") pod \"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea\" (UID: \"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.176497 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bde51b6-13ef-4b4b-b126-b3effc7dc8ea" (UID: "7bde51b6-13ef-4b4b-b126-b3effc7dc8ea"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.196150 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-kube-api-access-qfw9x" (OuterVolumeSpecName: "kube-api-access-qfw9x") pod "7bde51b6-13ef-4b4b-b126-b3effc7dc8ea" (UID: "7bde51b6-13ef-4b4b-b126-b3effc7dc8ea"). InnerVolumeSpecName "kube-api-access-qfw9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.281431 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfw9x\" (UniqueName: \"kubernetes.io/projected/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-kube-api-access-qfw9x\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.281474 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.449401 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fch86" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.462998 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-s4gh8" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.487143 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm7gg\" (UniqueName: \"kubernetes.io/projected/54376bb6-4987-4ef6-8814-51fa5f95e7bb-kube-api-access-hm7gg\") pod \"54376bb6-4987-4ef6-8814-51fa5f95e7bb\" (UID: \"54376bb6-4987-4ef6-8814-51fa5f95e7bb\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.487333 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54376bb6-4987-4ef6-8814-51fa5f95e7bb-operator-scripts\") pod \"54376bb6-4987-4ef6-8814-51fa5f95e7bb\" (UID: \"54376bb6-4987-4ef6-8814-51fa5f95e7bb\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.488599 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54376bb6-4987-4ef6-8814-51fa5f95e7bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54376bb6-4987-4ef6-8814-51fa5f95e7bb" (UID: "54376bb6-4987-4ef6-8814-51fa5f95e7bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.493013 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe50-account-create-update-rpfxl" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.499663 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54376bb6-4987-4ef6-8814-51fa5f95e7bb-kube-api-access-hm7gg" (OuterVolumeSpecName: "kube-api-access-hm7gg") pod "54376bb6-4987-4ef6-8814-51fa5f95e7bb" (UID: "54376bb6-4987-4ef6-8814-51fa5f95e7bb"). InnerVolumeSpecName "kube-api-access-hm7gg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.521990 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.533367 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.575013 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ljt26" event={"ID":"7bde51b6-13ef-4b4b-b126-b3effc7dc8ea","Type":"ContainerDied","Data":"ce664c0c8e6898acac67be7b9bb39623e803dbe937dd6be6364930c9f32308e5"} Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.575060 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce664c0c8e6898acac67be7b9bb39623e803dbe937dd6be6364930c9f32308e5" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.576616 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ljt26" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.581700 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fch86" event={"ID":"54376bb6-4987-4ef6-8814-51fa5f95e7bb","Type":"ContainerDied","Data":"23e4a50af1603e639b20dd72232f60c8328d4fa771e6df0a51aab9420e067a4f"} Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.581770 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23e4a50af1603e639b20dd72232f60c8328d4fa771e6df0a51aab9420e067a4f" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.581842 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fch86" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.587377 4807 generic.go:334] "Generic (PLEG): container finished" podID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerID="af444d5027b97215d0398e19787af2420a5406dd8f3f33a00bd64ff2801c5918" exitCode=143 Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.587440 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62","Type":"ContainerDied","Data":"af444d5027b97215d0398e19787af2420a5406dd8f3f33a00bd64ff2801c5918"} Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.588663 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c1e20d0-372b-45be-9e7a-eed8a5264b08-operator-scripts\") pod \"7c1e20d0-372b-45be-9e7a-eed8a5264b08\" (UID: \"7c1e20d0-372b-45be-9e7a-eed8a5264b08\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.588699 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fl7h\" (UniqueName: \"kubernetes.io/projected/04ab6b01-20b7-4320-aa91-2deecbdac66a-kube-api-access-5fl7h\") pod \"04ab6b01-20b7-4320-aa91-2deecbdac66a\" (UID: \"04ab6b01-20b7-4320-aa91-2deecbdac66a\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.590888 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbqpc\" (UniqueName: \"kubernetes.io/projected/d0864d50-612d-47a3-bc0f-59883303dedf-kube-api-access-mbqpc\") pod \"d0864d50-612d-47a3-bc0f-59883303dedf\" (UID: \"d0864d50-612d-47a3-bc0f-59883303dedf\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.591038 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq7b4\" (UniqueName: 
\"kubernetes.io/projected/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-kube-api-access-dq7b4\") pod \"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb\" (UID: \"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.591062 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-operator-scripts\") pod \"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb\" (UID: \"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.591136 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0864d50-612d-47a3-bc0f-59883303dedf-operator-scripts\") pod \"d0864d50-612d-47a3-bc0f-59883303dedf\" (UID: \"d0864d50-612d-47a3-bc0f-59883303dedf\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.591177 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69xjd\" (UniqueName: \"kubernetes.io/projected/7c1e20d0-372b-45be-9e7a-eed8a5264b08-kube-api-access-69xjd\") pod \"7c1e20d0-372b-45be-9e7a-eed8a5264b08\" (UID: \"7c1e20d0-372b-45be-9e7a-eed8a5264b08\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.591228 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ab6b01-20b7-4320-aa91-2deecbdac66a-operator-scripts\") pod \"04ab6b01-20b7-4320-aa91-2deecbdac66a\" (UID: \"04ab6b01-20b7-4320-aa91-2deecbdac66a\") " Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.591757 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54376bb6-4987-4ef6-8814-51fa5f95e7bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.591769 4807 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-hm7gg\" (UniqueName: \"kubernetes.io/projected/54376bb6-4987-4ef6-8814-51fa5f95e7bb-kube-api-access-hm7gg\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.594830 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0864d50-612d-47a3-bc0f-59883303dedf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0864d50-612d-47a3-bc0f-59883303dedf" (UID: "d0864d50-612d-47a3-bc0f-59883303dedf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.596214 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-kube-api-access-dq7b4" (OuterVolumeSpecName: "kube-api-access-dq7b4") pod "ac62b3e0-8a0f-4953-b543-4f699b3bb7cb" (UID: "ac62b3e0-8a0f-4953-b543-4f699b3bb7cb"). InnerVolumeSpecName "kube-api-access-dq7b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.596834 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c1e20d0-372b-45be-9e7a-eed8a5264b08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c1e20d0-372b-45be-9e7a-eed8a5264b08" (UID: "7c1e20d0-372b-45be-9e7a-eed8a5264b08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.598821 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04ab6b01-20b7-4320-aa91-2deecbdac66a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "04ab6b01-20b7-4320-aa91-2deecbdac66a" (UID: "04ab6b01-20b7-4320-aa91-2deecbdac66a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.602516 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s4gh8" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.602505 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s4gh8" event={"ID":"7c1e20d0-372b-45be-9e7a-eed8a5264b08","Type":"ContainerDied","Data":"10ef11c1e2b5baac0b840b88857cc680a434b7314192fe690e869a6802e937ab"} Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.602672 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10ef11c1e2b5baac0b840b88857cc680a434b7314192fe690e869a6802e937ab" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.603166 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0864d50-612d-47a3-bc0f-59883303dedf-kube-api-access-mbqpc" (OuterVolumeSpecName: "kube-api-access-mbqpc") pod "d0864d50-612d-47a3-bc0f-59883303dedf" (UID: "d0864d50-612d-47a3-bc0f-59883303dedf"). InnerVolumeSpecName "kube-api-access-mbqpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.604130 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ab6b01-20b7-4320-aa91-2deecbdac66a-kube-api-access-5fl7h" (OuterVolumeSpecName: "kube-api-access-5fl7h") pod "04ab6b01-20b7-4320-aa91-2deecbdac66a" (UID: "04ab6b01-20b7-4320-aa91-2deecbdac66a"). InnerVolumeSpecName "kube-api-access-5fl7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.605291 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac62b3e0-8a0f-4953-b543-4f699b3bb7cb" (UID: "ac62b3e0-8a0f-4953-b543-4f699b3bb7cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.606903 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" event={"ID":"04ab6b01-20b7-4320-aa91-2deecbdac66a","Type":"ContainerDied","Data":"12ea2e61da72edd74bf5dd3b60f7e231f82b65975b86c076d53b68e9f98d7001"} Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.606959 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12ea2e61da72edd74bf5dd3b60f7e231f82b65975b86c076d53b68e9f98d7001" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.607020 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4f38-account-create-update-wbhvj" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.610625 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe50-account-create-update-rpfxl" event={"ID":"d0864d50-612d-47a3-bc0f-59883303dedf","Type":"ContainerDied","Data":"7050e2d60ae8d9f1628f5e61a297e30e4855ff0ac3a80a08a62d4f6491853e63"} Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.610663 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7050e2d60ae8d9f1628f5e61a297e30e4855ff0ac3a80a08a62d4f6491853e63" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.610860 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fe50-account-create-update-rpfxl" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.613693 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" event={"ID":"ac62b3e0-8a0f-4953-b543-4f699b3bb7cb","Type":"ContainerDied","Data":"18cd746fc5ad25e89ed524db0457ec7909df49dfeb7179a0a594ba9cae188ee5"} Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.613765 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18cd746fc5ad25e89ed524db0457ec7909df49dfeb7179a0a594ba9cae188ee5" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.613780 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-fb4f-account-create-update-rnlg5" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.624940 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c1e20d0-372b-45be-9e7a-eed8a5264b08-kube-api-access-69xjd" (OuterVolumeSpecName: "kube-api-access-69xjd") pod "7c1e20d0-372b-45be-9e7a-eed8a5264b08" (UID: "7c1e20d0-372b-45be-9e7a-eed8a5264b08"). InnerVolumeSpecName "kube-api-access-69xjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.698650 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbqpc\" (UniqueName: \"kubernetes.io/projected/d0864d50-612d-47a3-bc0f-59883303dedf-kube-api-access-mbqpc\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.698694 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq7b4\" (UniqueName: \"kubernetes.io/projected/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-kube-api-access-dq7b4\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.698708 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.698737 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0864d50-612d-47a3-bc0f-59883303dedf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.698755 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69xjd\" (UniqueName: \"kubernetes.io/projected/7c1e20d0-372b-45be-9e7a-eed8a5264b08-kube-api-access-69xjd\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.698768 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ab6b01-20b7-4320-aa91-2deecbdac66a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.698778 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c1e20d0-372b-45be-9e7a-eed8a5264b08-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 
20:22:26 crc kubenswrapper[4807]: I1202 20:22:26.698790 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fl7h\" (UniqueName: \"kubernetes.io/projected/04ab6b01-20b7-4320-aa91-2deecbdac66a-kube-api-access-5fl7h\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:27 crc kubenswrapper[4807]: I1202 20:22:27.638028 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66","Type":"ContainerStarted","Data":"dc6133c01f5d78484890fd8acb046b9029b28806b82f86cd99234db1dbb5c1a6"} Dec 02 20:22:27 crc kubenswrapper[4807]: I1202 20:22:27.638334 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 20:22:27 crc kubenswrapper[4807]: I1202 20:22:27.638166 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="ceilometer-central-agent" containerID="cri-o://a91b7221a9ef17496b2831a07e93cbc306113c6716495c912d3c40bdeb3437f5" gracePeriod=30 Dec 02 20:22:27 crc kubenswrapper[4807]: I1202 20:22:27.638434 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="ceilometer-notification-agent" containerID="cri-o://88f09b99835ddf7ee29bc4a6b2c519c183c21b3c370aca7e13b229ebf50952a1" gracePeriod=30 Dec 02 20:22:27 crc kubenswrapper[4807]: I1202 20:22:27.638500 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="proxy-httpd" containerID="cri-o://dc6133c01f5d78484890fd8acb046b9029b28806b82f86cd99234db1dbb5c1a6" gracePeriod=30 Dec 02 20:22:27 crc kubenswrapper[4807]: I1202 20:22:27.639366 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="sg-core" containerID="cri-o://f0fc967027c35c9af4d717f0647556452e3f07dadb2522d9edd32e6232dd7505" gracePeriod=30 Dec 02 20:22:27 crc kubenswrapper[4807]: I1202 20:22:27.668167 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.666854909 podStartE2EDuration="7.668150342s" podCreationTimestamp="2025-12-02 20:22:20 +0000 UTC" firstStartedPulling="2025-12-02 20:22:21.694943348 +0000 UTC m=+1476.995850843" lastFinishedPulling="2025-12-02 20:22:26.696238781 +0000 UTC m=+1481.997146276" observedRunningTime="2025-12-02 20:22:27.664088152 +0000 UTC m=+1482.964995647" watchObservedRunningTime="2025-12-02 20:22:27.668150342 +0000 UTC m=+1482.969057837" Dec 02 20:22:27 crc kubenswrapper[4807]: I1202 20:22:27.885883 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.164:9292/healthcheck\": read tcp 10.217.0.2:52704->10.217.0.164:9292: read: connection reset by peer" Dec 02 20:22:27 crc kubenswrapper[4807]: I1202 20:22:27.885898 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.164:9292/healthcheck\": read tcp 10.217.0.2:52690->10.217.0.164:9292: read: connection reset by peer" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.655816 4807 generic.go:334] "Generic (PLEG): container finished" podID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerID="5923f4a532f99d9911076926bb80bbf7974024d3d16e2323a9861629b0725eac" exitCode=0 Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.655895 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"88b698ce-a6a9-4607-8629-394e3a2d7d29","Type":"ContainerDied","Data":"5923f4a532f99d9911076926bb80bbf7974024d3d16e2323a9861629b0725eac"} Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.665274 4807 generic.go:334] "Generic (PLEG): container finished" podID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerID="dc6133c01f5d78484890fd8acb046b9029b28806b82f86cd99234db1dbb5c1a6" exitCode=0 Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.665323 4807 generic.go:334] "Generic (PLEG): container finished" podID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerID="f0fc967027c35c9af4d717f0647556452e3f07dadb2522d9edd32e6232dd7505" exitCode=2 Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.665335 4807 generic.go:334] "Generic (PLEG): container finished" podID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerID="88f09b99835ddf7ee29bc4a6b2c519c183c21b3c370aca7e13b229ebf50952a1" exitCode=0 Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.665348 4807 generic.go:334] "Generic (PLEG): container finished" podID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerID="a91b7221a9ef17496b2831a07e93cbc306113c6716495c912d3c40bdeb3437f5" exitCode=0 Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.665357 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66","Type":"ContainerDied","Data":"dc6133c01f5d78484890fd8acb046b9029b28806b82f86cd99234db1dbb5c1a6"} Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.665410 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66","Type":"ContainerDied","Data":"f0fc967027c35c9af4d717f0647556452e3f07dadb2522d9edd32e6232dd7505"} Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.665426 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66","Type":"ContainerDied","Data":"88f09b99835ddf7ee29bc4a6b2c519c183c21b3c370aca7e13b229ebf50952a1"} Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.665438 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66","Type":"ContainerDied","Data":"a91b7221a9ef17496b2831a07e93cbc306113c6716495c912d3c40bdeb3437f5"} Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.665451 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66","Type":"ContainerDied","Data":"452ec1b8016f9165eee82e3b2508137100dba9ed8833f67dcccb8beefc41a7d9"} Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.665465 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="452ec1b8016f9165eee82e3b2508137100dba9ed8833f67dcccb8beefc41a7d9" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.729944 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.745089 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.854774 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-config-data\") pod \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.854825 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-scripts\") pod \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.854880 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-combined-ca-bundle\") pod \"88b698ce-a6a9-4607-8629-394e3a2d7d29\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.854924 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-run-httpd\") pod \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.854974 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-scripts\") pod \"88b698ce-a6a9-4607-8629-394e3a2d7d29\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.855022 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-logs\") pod \"88b698ce-a6a9-4607-8629-394e3a2d7d29\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.855052 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"88b698ce-a6a9-4607-8629-394e3a2d7d29\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.855072 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-sg-core-conf-yaml\") pod \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.855184 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-log-httpd\") pod \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.855232 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-combined-ca-bundle\") pod \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.855302 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-httpd-run\") pod \"88b698ce-a6a9-4607-8629-394e3a2d7d29\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.855392 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ttspj\" (UniqueName: \"kubernetes.io/projected/88b698ce-a6a9-4607-8629-394e3a2d7d29-kube-api-access-ttspj\") pod \"88b698ce-a6a9-4607-8629-394e3a2d7d29\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.855421 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-public-tls-certs\") pod \"88b698ce-a6a9-4607-8629-394e3a2d7d29\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.855460 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxkds\" (UniqueName: \"kubernetes.io/projected/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-kube-api-access-gxkds\") pod \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\" (UID: \"031b4ae2-ce6b-44f2-87d0-c9b3de1aae66\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.855488 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-config-data\") pod \"88b698ce-a6a9-4607-8629-394e3a2d7d29\" (UID: \"88b698ce-a6a9-4607-8629-394e3a2d7d29\") " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.857915 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.165:9292/healthcheck\": read tcp 10.217.0.2:40228->10.217.0.165:9292: read: connection reset by peer" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.858386 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerName="glance-log" probeResult="failure" 
output="Get \"https://10.217.0.165:9292/healthcheck\": read tcp 10.217.0.2:40242->10.217.0.165:9292: read: connection reset by peer" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.858999 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "88b698ce-a6a9-4607-8629-394e3a2d7d29" (UID: "88b698ce-a6a9-4607-8629-394e3a2d7d29"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.859025 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" (UID: "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.866913 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-logs" (OuterVolumeSpecName: "logs") pod "88b698ce-a6a9-4607-8629-394e3a2d7d29" (UID: "88b698ce-a6a9-4607-8629-394e3a2d7d29"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.881046 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-kube-api-access-gxkds" (OuterVolumeSpecName: "kube-api-access-gxkds") pod "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" (UID: "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66"). InnerVolumeSpecName "kube-api-access-gxkds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.882829 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" (UID: "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.888930 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "88b698ce-a6a9-4607-8629-394e3a2d7d29" (UID: "88b698ce-a6a9-4607-8629-394e3a2d7d29"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.898630 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-scripts" (OuterVolumeSpecName: "scripts") pod "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" (UID: "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.906041 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b698ce-a6a9-4607-8629-394e3a2d7d29-kube-api-access-ttspj" (OuterVolumeSpecName: "kube-api-access-ttspj") pod "88b698ce-a6a9-4607-8629-394e3a2d7d29" (UID: "88b698ce-a6a9-4607-8629-394e3a2d7d29"). InnerVolumeSpecName "kube-api-access-ttspj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.956959 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-scripts" (OuterVolumeSpecName: "scripts") pod "88b698ce-a6a9-4607-8629-394e3a2d7d29" (UID: "88b698ce-a6a9-4607-8629-394e3a2d7d29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.959206 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttspj\" (UniqueName: \"kubernetes.io/projected/88b698ce-a6a9-4607-8629-394e3a2d7d29-kube-api-access-ttspj\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.959241 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxkds\" (UniqueName: \"kubernetes.io/projected/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-kube-api-access-gxkds\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.959253 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.959265 4807 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.959276 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.959288 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.959320 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.959334 4807 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:28 crc kubenswrapper[4807]: I1202 20:22:28.959349 4807 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/88b698ce-a6a9-4607-8629-394e3a2d7d29-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.005988 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88b698ce-a6a9-4607-8629-394e3a2d7d29" (UID: "88b698ce-a6a9-4607-8629-394e3a2d7d29"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.006007 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" (UID: "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.063043 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.063102 4807 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.131231 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-config-data" (OuterVolumeSpecName: "config-data") pod "88b698ce-a6a9-4607-8629-394e3a2d7d29" (UID: "88b698ce-a6a9-4607-8629-394e3a2d7d29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.139352 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.171477 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.171522 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.182462 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-public-tls-certs" (OuterVolumeSpecName: 
"public-tls-certs") pod "88b698ce-a6a9-4607-8629-394e3a2d7d29" (UID: "88b698ce-a6a9-4607-8629-394e3a2d7d29"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.192409 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" (UID: "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.227295 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-config-data" (OuterVolumeSpecName: "config-data") pod "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" (UID: "031b4ae2-ce6b-44f2-87d0-c9b3de1aae66"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.273626 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.274030 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.274168 4807 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88b698ce-a6a9-4607-8629-394e3a2d7d29-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.684801 4807 generic.go:334] "Generic (PLEG): container finished" podID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerID="a1234d2f9bbea41e51f17c78fe72a9d053c485e8d7dd5e87d80b7a256d014711" exitCode=0 Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.684886 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62","Type":"ContainerDied","Data":"a1234d2f9bbea41e51f17c78fe72a9d053c485e8d7dd5e87d80b7a256d014711"} Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.701624 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.702870 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"88b698ce-a6a9-4607-8629-394e3a2d7d29","Type":"ContainerDied","Data":"8a2e324022adecb3f865ca1d1068e038c33dd749abc8ee5073fd03b44e088596"} Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.702925 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.702959 4807 scope.go:117] "RemoveContainer" containerID="5923f4a532f99d9911076926bb80bbf7974024d3d16e2323a9861629b0725eac" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.764619 4807 scope.go:117] "RemoveContainer" containerID="5da3672237909f2b76e84812636ed8d7d8ec3f5487f44ca0397dbb4ca09ef4da" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.776410 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.811495 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.838452 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.884871 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.930170 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931577 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54376bb6-4987-4ef6-8814-51fa5f95e7bb" containerName="mariadb-database-create" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.931609 4807 
state_mem.go:107] "Deleted CPUSet assignment" podUID="54376bb6-4987-4ef6-8814-51fa5f95e7bb" containerName="mariadb-database-create" Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931623 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac62b3e0-8a0f-4953-b543-4f699b3bb7cb" containerName="mariadb-account-create-update" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.931632 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac62b3e0-8a0f-4953-b543-4f699b3bb7cb" containerName="mariadb-account-create-update" Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931648 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerName="glance-httpd" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.931656 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerName="glance-httpd" Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931667 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ab6b01-20b7-4320-aa91-2deecbdac66a" containerName="mariadb-account-create-update" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.931676 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ab6b01-20b7-4320-aa91-2deecbdac66a" containerName="mariadb-account-create-update" Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931689 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerName="glance-log" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.931697 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerName="glance-log" Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931731 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c1e20d0-372b-45be-9e7a-eed8a5264b08" containerName="mariadb-database-create" Dec 02 20:22:29 crc 
kubenswrapper[4807]: I1202 20:22:29.931742 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c1e20d0-372b-45be-9e7a-eed8a5264b08" containerName="mariadb-database-create" Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931762 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="ceilometer-notification-agent" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.931770 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="ceilometer-notification-agent" Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931785 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="proxy-httpd" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.931793 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="proxy-httpd" Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931820 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0864d50-612d-47a3-bc0f-59883303dedf" containerName="mariadb-account-create-update" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.931828 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0864d50-612d-47a3-bc0f-59883303dedf" containerName="mariadb-account-create-update" Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931854 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bde51b6-13ef-4b4b-b126-b3effc7dc8ea" containerName="mariadb-database-create" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.931861 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bde51b6-13ef-4b4b-b126-b3effc7dc8ea" containerName="mariadb-database-create" Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931874 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="sg-core" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.931882 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="sg-core" Dec 02 20:22:29 crc kubenswrapper[4807]: E1202 20:22:29.931897 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="ceilometer-central-agent" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.931906 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="ceilometer-central-agent" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933750 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="54376bb6-4987-4ef6-8814-51fa5f95e7bb" containerName="mariadb-database-create" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933779 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="sg-core" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933790 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="ceilometer-central-agent" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933808 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerName="glance-log" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933823 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b698ce-a6a9-4607-8629-394e3a2d7d29" containerName="glance-httpd" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933838 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0864d50-612d-47a3-bc0f-59883303dedf" containerName="mariadb-account-create-update" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933854 4807 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="04ab6b01-20b7-4320-aa91-2deecbdac66a" containerName="mariadb-account-create-update" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933872 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="proxy-httpd" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933882 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac62b3e0-8a0f-4953-b543-4f699b3bb7cb" containerName="mariadb-account-create-update" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933903 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bde51b6-13ef-4b4b-b126-b3effc7dc8ea" containerName="mariadb-database-create" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933914 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" containerName="ceilometer-notification-agent" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.933926 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c1e20d0-372b-45be-9e7a-eed8a5264b08" containerName="mariadb-database-create" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.935547 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.937695 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.942112 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.947587 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.969512 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.972614 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.976185 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.978326 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 20:22:29 crc kubenswrapper[4807]: I1202 20:22:29.999699 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.040473 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.040579 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-config-data\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.040633 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-scripts\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.040661 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.040679 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt8ph\" (UniqueName: \"kubernetes.io/projected/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-kube-api-access-lt8ph\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.040749 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-logs\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.040814 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.040919 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.040939 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf55t\" (UniqueName: \"kubernetes.io/projected/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-kube-api-access-pf55t\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.041038 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.041091 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-config-data\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.041179 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.041296 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.041432 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.041477 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.144694 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.144806 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.144866 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-scripts\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.144908 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.144950 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt8ph\" (UniqueName: \"kubernetes.io/projected/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-kube-api-access-lt8ph\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.145009 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-logs\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.145056 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.145098 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.145122 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf55t\" (UniqueName: \"kubernetes.io/projected/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-kube-api-access-pf55t\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.145176 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.145233 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-config-data\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.145287 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-scripts\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.145345 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.145417 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.145447 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.146199 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-logs\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.146687 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.147419 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" 
Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.147593 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.153533 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-config-data\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.153991 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.156358 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.157537 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.157978 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-scripts\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.158352 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.164767 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-scripts\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.166880 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-config-data\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.176528 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf55t\" (UniqueName: \"kubernetes.io/projected/72b4a2ac-3f2c-4064-bcd9-b40585699ab9-kube-api-access-pf55t\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.183488 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt8ph\" (UniqueName: \"kubernetes.io/projected/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-kube-api-access-lt8ph\") pod \"ceilometer-0\" (UID: 
\"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.199512 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"72b4a2ac-3f2c-4064-bcd9-b40585699ab9\") " pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.205338 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.268023 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.581226 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.748048 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.760736 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62","Type":"ContainerDied","Data":"6a5c6268c2ed72948f67dfc5e85bcfe50696e07d236efc57ddb6c07a6e20e3e5"} Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.760793 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a5c6268c2ed72948f67dfc5e85bcfe50696e07d236efc57ddb6c07a6e20e3e5" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.790192 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.799790 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.995655 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-combined-ca-bundle\") pod \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.995746 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.995886 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-httpd-run\") pod \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.996003 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-internal-tls-certs\") pod \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.996114 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt6nq\" (UniqueName: \"kubernetes.io/projected/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-kube-api-access-jt6nq\") pod \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " 
Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.996157 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-config-data\") pod \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.996300 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-logs\") pod \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.996360 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-scripts\") pod \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\" (UID: \"ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62\") " Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.996871 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" (UID: "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.997418 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-logs" (OuterVolumeSpecName: "logs") pod "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" (UID: "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:30 crc kubenswrapper[4807]: I1202 20:22:30.997474 4807 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.005589 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-kube-api-access-jt6nq" (OuterVolumeSpecName: "kube-api-access-jt6nq") pod "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" (UID: "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62"). InnerVolumeSpecName "kube-api-access-jt6nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.006210 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-scripts" (OuterVolumeSpecName: "scripts") pod "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" (UID: "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.008310 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" (UID: "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.022310 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031b4ae2-ce6b-44f2-87d0-c9b3de1aae66" path="/var/lib/kubelet/pods/031b4ae2-ce6b-44f2-87d0-c9b3de1aae66/volumes" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.023216 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88b698ce-a6a9-4607-8629-394e3a2d7d29" path="/var/lib/kubelet/pods/88b698ce-a6a9-4607-8629-394e3a2d7d29/volumes" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.060413 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" (UID: "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.115068 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-config-data" (OuterVolumeSpecName: "config-data") pod "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" (UID: "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.118658 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.118709 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.118743 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.118788 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.118803 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt6nq\" (UniqueName: \"kubernetes.io/projected/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-kube-api-access-jt6nq\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.118815 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.121553 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" (UID: "ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.178331 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.222999 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.226280 4807 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.264980 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.442355 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.787185 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b","Type":"ContainerStarted","Data":"496d0ace14e4c03b8068781caf15acf4a6be124847d93cfe74a62ea9542361fa"} Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.791198 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"72b4a2ac-3f2c-4064-bcd9-b40585699ab9","Type":"ContainerStarted","Data":"d6bda2e79616c95505a87b712a8c72fbff2657d246ec4a4b8caf9929abb13a35"} Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.791418 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.793590 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.850704 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.858325 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.889751 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.910840 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.937617 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:22:31 crc kubenswrapper[4807]: E1202 20:22:31.938226 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerName="glance-httpd" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.938245 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerName="glance-httpd" Dec 02 20:22:31 crc kubenswrapper[4807]: E1202 20:22:31.938260 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerName="glance-log" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.938268 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerName="glance-log" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.938510 4807 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerName="glance-log" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.938539 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" containerName="glance-httpd" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.944006 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.972623 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.979603 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 20:22:31 crc kubenswrapper[4807]: I1202 20:22:31.979884 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.078267 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.078342 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.078423 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.078503 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.078622 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg7vc\" (UniqueName: \"kubernetes.io/projected/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-kube-api-access-gg7vc\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.078668 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.078866 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.078905 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.186768 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.187364 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.187484 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg7vc\" (UniqueName: \"kubernetes.io/projected/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-kube-api-access-gg7vc\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.187533 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.192177 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.192921 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.192999 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.193084 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.193125 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.198389 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.198837 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.205557 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.205845 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.210302 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.213363 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " 
pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.234382 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg7vc\" (UniqueName: \"kubernetes.io/projected/7b4380db-c1e5-4f3b-81f6-ae5a6d71119a-kube-api-access-gg7vc\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.264778 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a\") " pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.299446 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hrk4p"] Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.303432 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.305303 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.308541 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.308825 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-68rp7" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.308981 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.343883 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hrk4p"] Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.397386 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-config-data\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.397465 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc92b\" (UniqueName: \"kubernetes.io/projected/76d53985-4c04-4e9c-be07-b866ac014640-kube-api-access-hc92b\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.397523 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-scripts\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " 
pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.397631 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.527229 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc92b\" (UniqueName: \"kubernetes.io/projected/76d53985-4c04-4e9c-be07-b866ac014640-kube-api-access-hc92b\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.528187 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-scripts\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.528838 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.529111 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-config-data\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: 
\"76d53985-4c04-4e9c-be07-b866ac014640\") " pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.534859 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-config-data\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.535896 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.540662 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-scripts\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.552449 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc92b\" (UniqueName: \"kubernetes.io/projected/76d53985-4c04-4e9c-be07-b866ac014640-kube-api-access-hc92b\") pod \"nova-cell0-conductor-db-sync-hrk4p\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.559313 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.832827 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b","Type":"ContainerStarted","Data":"e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c"} Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.844148 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"72b4a2ac-3f2c-4064-bcd9-b40585699ab9","Type":"ContainerStarted","Data":"e301dddb7fc77c33ac67f8e0679c7112b363282948a7f89e8c066e62b40b6486"} Dec 02 20:22:32 crc kubenswrapper[4807]: I1202 20:22:32.869129 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 20:22:32 crc kubenswrapper[4807]: W1202 20:22:32.911520 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b4380db_c1e5_4f3b_81f6_ae5a6d71119a.slice/crio-12264be2fdd74814ddd19b57f44803854c32432dc8e889ba59c2fc82c31ba858 WatchSource:0}: Error finding container 12264be2fdd74814ddd19b57f44803854c32432dc8e889ba59c2fc82c31ba858: Status 404 returned error can't find the container with id 12264be2fdd74814ddd19b57f44803854c32432dc8e889ba59c2fc82c31ba858 Dec 02 20:22:33 crc kubenswrapper[4807]: I1202 20:22:33.018012 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62" path="/var/lib/kubelet/pods/ba555dc5-d9d8-45fc-9c5e-84d64e7d5a62/volumes" Dec 02 20:22:33 crc kubenswrapper[4807]: I1202 20:22:33.133458 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hrk4p"] Dec 02 20:22:33 crc kubenswrapper[4807]: I1202 20:22:33.871935 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b","Type":"ContainerStarted","Data":"5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1"} Dec 02 20:22:33 crc kubenswrapper[4807]: I1202 20:22:33.880868 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a","Type":"ContainerStarted","Data":"a7490f1430f3325be7e67539ac774e958af7060e0163725e6c838946cc0cb789"} Dec 02 20:22:33 crc kubenswrapper[4807]: I1202 20:22:33.880919 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a","Type":"ContainerStarted","Data":"12264be2fdd74814ddd19b57f44803854c32432dc8e889ba59c2fc82c31ba858"} Dec 02 20:22:33 crc kubenswrapper[4807]: I1202 20:22:33.898224 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"72b4a2ac-3f2c-4064-bcd9-b40585699ab9","Type":"ContainerStarted","Data":"7f94eb52060cf5384f140a72e2f56f4ca625a2b606c2db72f85bb62aa5ff7a1f"} Dec 02 20:22:33 crc kubenswrapper[4807]: I1202 20:22:33.909596 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hrk4p" event={"ID":"76d53985-4c04-4e9c-be07-b866ac014640","Type":"ContainerStarted","Data":"614671a58315301b18e664e8bcf25c87641cfc4870f1c901952bb1f3038e45da"} Dec 02 20:22:33 crc kubenswrapper[4807]: I1202 20:22:33.936956 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.936930363 podStartE2EDuration="4.936930363s" podCreationTimestamp="2025-12-02 20:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:22:33.929011599 +0000 UTC m=+1489.229919094" watchObservedRunningTime="2025-12-02 20:22:33.936930363 +0000 UTC m=+1489.237837858" Dec 02 
20:22:34 crc kubenswrapper[4807]: I1202 20:22:34.927260 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b4380db-c1e5-4f3b-81f6-ae5a6d71119a","Type":"ContainerStarted","Data":"7f91b285a78a1e926249d0840f40a10ae462d119b9591cc292c46ba28e4d06e9"} Dec 02 20:22:34 crc kubenswrapper[4807]: I1202 20:22:34.940001 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b","Type":"ContainerStarted","Data":"a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb"} Dec 02 20:22:34 crc kubenswrapper[4807]: I1202 20:22:34.957130 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.95710476 podStartE2EDuration="3.95710476s" podCreationTimestamp="2025-12-02 20:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:22:34.955176782 +0000 UTC m=+1490.256084277" watchObservedRunningTime="2025-12-02 20:22:34.95710476 +0000 UTC m=+1490.258012255" Dec 02 20:22:36 crc kubenswrapper[4807]: I1202 20:22:36.971287 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b","Type":"ContainerStarted","Data":"209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037"} Dec 02 20:22:36 crc kubenswrapper[4807]: I1202 20:22:36.973940 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 20:22:37 crc kubenswrapper[4807]: I1202 20:22:37.005224 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.486982603 podStartE2EDuration="8.005203783s" podCreationTimestamp="2025-12-02 20:22:29 +0000 UTC" firstStartedPulling="2025-12-02 20:22:31.301350933 +0000 UTC 
m=+1486.602258428" lastFinishedPulling="2025-12-02 20:22:35.819572113 +0000 UTC m=+1491.120479608" observedRunningTime="2025-12-02 20:22:37.000512684 +0000 UTC m=+1492.301420189" watchObservedRunningTime="2025-12-02 20:22:37.005203783 +0000 UTC m=+1492.306111278" Dec 02 20:22:39 crc kubenswrapper[4807]: I1202 20:22:39.004706 4807 generic.go:334] "Generic (PLEG): container finished" podID="62551774-7dc7-4727-a79d-f92d4f82d560" containerID="6140a25963321ba78bfcc620d218555820f13265bded3d755ce761deefdb6ef7" exitCode=137 Dec 02 20:22:39 crc kubenswrapper[4807]: I1202 20:22:39.005110 4807 generic.go:334] "Generic (PLEG): container finished" podID="62551774-7dc7-4727-a79d-f92d4f82d560" containerID="acc5eee2fa3e2e060eea697bd28253234491412bf2a4bb3353d3abe041d69fa2" exitCode=137 Dec 02 20:22:39 crc kubenswrapper[4807]: I1202 20:22:39.006898 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fc8c44-b8pmd" event={"ID":"62551774-7dc7-4727-a79d-f92d4f82d560","Type":"ContainerDied","Data":"6140a25963321ba78bfcc620d218555820f13265bded3d755ce761deefdb6ef7"} Dec 02 20:22:39 crc kubenswrapper[4807]: I1202 20:22:39.006941 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fc8c44-b8pmd" event={"ID":"62551774-7dc7-4727-a79d-f92d4f82d560","Type":"ContainerDied","Data":"acc5eee2fa3e2e060eea697bd28253234491412bf2a4bb3353d3abe041d69fa2"} Dec 02 20:22:39 crc kubenswrapper[4807]: I1202 20:22:39.006989 4807 scope.go:117] "RemoveContainer" containerID="1a2bc25490b727e95a676dc6e738dd9535b3f9346206cb4c536bfdd80fd32d19" Dec 02 20:22:40 crc kubenswrapper[4807]: I1202 20:22:40.269676 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 20:22:40 crc kubenswrapper[4807]: I1202 20:22:40.269736 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 20:22:40 crc kubenswrapper[4807]: I1202 
20:22:40.317735 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 20:22:40 crc kubenswrapper[4807]: I1202 20:22:40.322081 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 20:22:41 crc kubenswrapper[4807]: I1202 20:22:41.039784 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 20:22:41 crc kubenswrapper[4807]: I1202 20:22:41.040121 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.231674 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cc8fc8c44-b8pmd" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.307127 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.307204 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.339956 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-config-data\") pod \"62551774-7dc7-4727-a79d-f92d4f82d560\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.340502 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-scripts\") pod \"62551774-7dc7-4727-a79d-f92d4f82d560\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.340624 4807 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-tls-certs\") pod \"62551774-7dc7-4727-a79d-f92d4f82d560\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.340802 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk96s\" (UniqueName: \"kubernetes.io/projected/62551774-7dc7-4727-a79d-f92d4f82d560-kube-api-access-gk96s\") pod \"62551774-7dc7-4727-a79d-f92d4f82d560\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.340868 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62551774-7dc7-4727-a79d-f92d4f82d560-logs\") pod \"62551774-7dc7-4727-a79d-f92d4f82d560\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.340900 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-secret-key\") pod \"62551774-7dc7-4727-a79d-f92d4f82d560\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.340993 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-combined-ca-bundle\") pod \"62551774-7dc7-4727-a79d-f92d4f82d560\" (UID: \"62551774-7dc7-4727-a79d-f92d4f82d560\") " Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.341969 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62551774-7dc7-4727-a79d-f92d4f82d560-logs" (OuterVolumeSpecName: "logs") pod "62551774-7dc7-4727-a79d-f92d4f82d560" (UID: 
"62551774-7dc7-4727-a79d-f92d4f82d560"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.348402 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "62551774-7dc7-4727-a79d-f92d4f82d560" (UID: "62551774-7dc7-4727-a79d-f92d4f82d560"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.353087 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62551774-7dc7-4727-a79d-f92d4f82d560-kube-api-access-gk96s" (OuterVolumeSpecName: "kube-api-access-gk96s") pod "62551774-7dc7-4727-a79d-f92d4f82d560" (UID: "62551774-7dc7-4727-a79d-f92d4f82d560"). InnerVolumeSpecName "kube-api-access-gk96s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.377998 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-scripts" (OuterVolumeSpecName: "scripts") pod "62551774-7dc7-4727-a79d-f92d4f82d560" (UID: "62551774-7dc7-4727-a79d-f92d4f82d560"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.387399 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62551774-7dc7-4727-a79d-f92d4f82d560" (UID: "62551774-7dc7-4727-a79d-f92d4f82d560"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.396389 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.396624 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.415695 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-config-data" (OuterVolumeSpecName: "config-data") pod "62551774-7dc7-4727-a79d-f92d4f82d560" (UID: "62551774-7dc7-4727-a79d-f92d4f82d560"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.439857 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "62551774-7dc7-4727-a79d-f92d4f82d560" (UID: "62551774-7dc7-4727-a79d-f92d4f82d560"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.448608 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.449445 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.450963 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/62551774-7dc7-4727-a79d-f92d4f82d560-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.451087 4807 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.451179 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk96s\" (UniqueName: \"kubernetes.io/projected/62551774-7dc7-4727-a79d-f92d4f82d560-kube-api-access-gk96s\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.451262 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62551774-7dc7-4727-a79d-f92d4f82d560-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:42 crc kubenswrapper[4807]: I1202 20:22:42.451347 4807 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/62551774-7dc7-4727-a79d-f92d4f82d560-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.084968 4807 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cc8fc8c44-b8pmd" Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.084952 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cc8fc8c44-b8pmd" event={"ID":"62551774-7dc7-4727-a79d-f92d4f82d560","Type":"ContainerDied","Data":"ef4c3926c332bcd00dcf971b4776055b08fc6d5c025e9fcea4405cf8b54d214f"} Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.086310 4807 scope.go:117] "RemoveContainer" containerID="6140a25963321ba78bfcc620d218555820f13265bded3d755ce761deefdb6ef7" Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.091817 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hrk4p" event={"ID":"76d53985-4c04-4e9c-be07-b866ac014640","Type":"ContainerStarted","Data":"1a7ebd77d06c82058f56bc01c466308e693f119872c024b5d0f18cab460cfa11"} Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.091899 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.091927 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.147564 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.147681 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.148881 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.150965 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cc8fc8c44-b8pmd"] Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.168011 
4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7cc8fc8c44-b8pmd"] Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.175288 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-hrk4p" podStartSLOduration=2.28692256 podStartE2EDuration="11.175262643s" podCreationTimestamp="2025-12-02 20:22:32 +0000 UTC" firstStartedPulling="2025-12-02 20:22:33.147195087 +0000 UTC m=+1488.448102582" lastFinishedPulling="2025-12-02 20:22:42.03553515 +0000 UTC m=+1497.336442665" observedRunningTime="2025-12-02 20:22:43.142030583 +0000 UTC m=+1498.442938078" watchObservedRunningTime="2025-12-02 20:22:43.175262643 +0000 UTC m=+1498.476170148" Dec 02 20:22:43 crc kubenswrapper[4807]: I1202 20:22:43.360756 4807 scope.go:117] "RemoveContainer" containerID="acc5eee2fa3e2e060eea697bd28253234491412bf2a4bb3353d3abe041d69fa2" Dec 02 20:22:44 crc kubenswrapper[4807]: I1202 20:22:44.923067 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-52vl4"] Dec 02 20:22:44 crc kubenswrapper[4807]: E1202 20:22:44.924053 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon" Dec 02 20:22:44 crc kubenswrapper[4807]: I1202 20:22:44.924071 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon" Dec 02 20:22:44 crc kubenswrapper[4807]: E1202 20:22:44.924081 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon-log" Dec 02 20:22:44 crc kubenswrapper[4807]: I1202 20:22:44.924087 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon-log" Dec 02 20:22:44 crc kubenswrapper[4807]: I1202 20:22:44.924327 4807 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon" Dec 02 20:22:44 crc kubenswrapper[4807]: I1202 20:22:44.924340 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon-log" Dec 02 20:22:44 crc kubenswrapper[4807]: I1202 20:22:44.924352 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon" Dec 02 20:22:44 crc kubenswrapper[4807]: E1202 20:22:44.924553 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon" Dec 02 20:22:44 crc kubenswrapper[4807]: I1202 20:22:44.924564 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" containerName="horizon" Dec 02 20:22:44 crc kubenswrapper[4807]: I1202 20:22:44.926476 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.082504 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62551774-7dc7-4727-a79d-f92d4f82d560" path="/var/lib/kubelet/pods/62551774-7dc7-4727-a79d-f92d4f82d560/volumes" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.083911 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52vl4"] Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.115407 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.115435 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.125875 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-utilities\") pod \"certified-operators-52vl4\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.125953 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7hqd\" (UniqueName: \"kubernetes.io/projected/dd2fe568-7055-4982-8449-d1d90258a2dc-kube-api-access-g7hqd\") pod \"certified-operators-52vl4\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.126421 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-catalog-content\") pod \"certified-operators-52vl4\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.229310 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-catalog-content\") pod \"certified-operators-52vl4\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.229548 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-utilities\") pod \"certified-operators-52vl4\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.230246 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g7hqd\" (UniqueName: \"kubernetes.io/projected/dd2fe568-7055-4982-8449-d1d90258a2dc-kube-api-access-g7hqd\") pod \"certified-operators-52vl4\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.230252 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-utilities\") pod \"certified-operators-52vl4\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.231197 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-catalog-content\") pod \"certified-operators-52vl4\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.270873 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7hqd\" (UniqueName: \"kubernetes.io/projected/dd2fe568-7055-4982-8449-d1d90258a2dc-kube-api-access-g7hqd\") pod \"certified-operators-52vl4\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.566566 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.573889 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:45 crc kubenswrapper[4807]: I1202 20:22:45.574641 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 20:22:46 crc kubenswrapper[4807]: I1202 20:22:46.207582 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52vl4"] Dec 02 20:22:47 crc kubenswrapper[4807]: I1202 20:22:47.143662 4807 generic.go:334] "Generic (PLEG): container finished" podID="dd2fe568-7055-4982-8449-d1d90258a2dc" containerID="a0bbe961e0887f4dfeb26a54208fe98d8107ebfe904762d4ff3197a0a97fb5c7" exitCode=0 Dec 02 20:22:47 crc kubenswrapper[4807]: I1202 20:22:47.143750 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52vl4" event={"ID":"dd2fe568-7055-4982-8449-d1d90258a2dc","Type":"ContainerDied","Data":"a0bbe961e0887f4dfeb26a54208fe98d8107ebfe904762d4ff3197a0a97fb5c7"} Dec 02 20:22:47 crc kubenswrapper[4807]: I1202 20:22:47.144323 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52vl4" event={"ID":"dd2fe568-7055-4982-8449-d1d90258a2dc","Type":"ContainerStarted","Data":"c02c4291be97a551d83059dffe4b5a6b83075df4b456841ac8e7052a8978169e"} Dec 02 20:22:48 crc kubenswrapper[4807]: I1202 20:22:48.158380 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52vl4" event={"ID":"dd2fe568-7055-4982-8449-d1d90258a2dc","Type":"ContainerStarted","Data":"af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673"} Dec 02 20:22:49 crc kubenswrapper[4807]: I1202 20:22:49.171205 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="dd2fe568-7055-4982-8449-d1d90258a2dc" containerID="af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673" exitCode=0 Dec 02 20:22:49 crc kubenswrapper[4807]: I1202 20:22:49.171276 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52vl4" event={"ID":"dd2fe568-7055-4982-8449-d1d90258a2dc","Type":"ContainerDied","Data":"af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673"} Dec 02 20:22:51 crc kubenswrapper[4807]: I1202 20:22:51.199815 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52vl4" event={"ID":"dd2fe568-7055-4982-8449-d1d90258a2dc","Type":"ContainerStarted","Data":"e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d"} Dec 02 20:22:51 crc kubenswrapper[4807]: I1202 20:22:51.251422 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-52vl4" podStartSLOduration=3.839571502 podStartE2EDuration="7.251379433s" podCreationTimestamp="2025-12-02 20:22:44 +0000 UTC" firstStartedPulling="2025-12-02 20:22:47.147228568 +0000 UTC m=+1502.448136063" lastFinishedPulling="2025-12-02 20:22:50.559036489 +0000 UTC m=+1505.859943994" observedRunningTime="2025-12-02 20:22:51.219365269 +0000 UTC m=+1506.520272774" watchObservedRunningTime="2025-12-02 20:22:51.251379433 +0000 UTC m=+1506.552286928" Dec 02 20:22:55 crc kubenswrapper[4807]: I1202 20:22:55.249016 4807 generic.go:334] "Generic (PLEG): container finished" podID="76d53985-4c04-4e9c-be07-b866ac014640" containerID="1a7ebd77d06c82058f56bc01c466308e693f119872c024b5d0f18cab460cfa11" exitCode=0 Dec 02 20:22:55 crc kubenswrapper[4807]: I1202 20:22:55.249108 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hrk4p" event={"ID":"76d53985-4c04-4e9c-be07-b866ac014640","Type":"ContainerDied","Data":"1a7ebd77d06c82058f56bc01c466308e693f119872c024b5d0f18cab460cfa11"} Dec 02 
20:22:55 crc kubenswrapper[4807]: I1202 20:22:55.566926 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:55 crc kubenswrapper[4807]: I1202 20:22:55.567039 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:55 crc kubenswrapper[4807]: I1202 20:22:55.637066 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.341465 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.409127 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52vl4"] Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.696267 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.835825 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-scripts\") pod \"76d53985-4c04-4e9c-be07-b866ac014640\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.836428 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-combined-ca-bundle\") pod \"76d53985-4c04-4e9c-be07-b866ac014640\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.836454 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-config-data\") pod \"76d53985-4c04-4e9c-be07-b866ac014640\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.836505 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc92b\" (UniqueName: \"kubernetes.io/projected/76d53985-4c04-4e9c-be07-b866ac014640-kube-api-access-hc92b\") pod \"76d53985-4c04-4e9c-be07-b866ac014640\" (UID: \"76d53985-4c04-4e9c-be07-b866ac014640\") " Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.844742 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-scripts" (OuterVolumeSpecName: "scripts") pod "76d53985-4c04-4e9c-be07-b866ac014640" (UID: "76d53985-4c04-4e9c-be07-b866ac014640"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.845093 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d53985-4c04-4e9c-be07-b866ac014640-kube-api-access-hc92b" (OuterVolumeSpecName: "kube-api-access-hc92b") pod "76d53985-4c04-4e9c-be07-b866ac014640" (UID: "76d53985-4c04-4e9c-be07-b866ac014640"). InnerVolumeSpecName "kube-api-access-hc92b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.883017 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-config-data" (OuterVolumeSpecName: "config-data") pod "76d53985-4c04-4e9c-be07-b866ac014640" (UID: "76d53985-4c04-4e9c-be07-b866ac014640"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.886931 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76d53985-4c04-4e9c-be07-b866ac014640" (UID: "76d53985-4c04-4e9c-be07-b866ac014640"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.940132 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.940228 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.940254 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc92b\" (UniqueName: \"kubernetes.io/projected/76d53985-4c04-4e9c-be07-b866ac014640-kube-api-access-hc92b\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:56 crc kubenswrapper[4807]: I1202 20:22:56.940283 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76d53985-4c04-4e9c-be07-b866ac014640-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.279375 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hrk4p" event={"ID":"76d53985-4c04-4e9c-be07-b866ac014640","Type":"ContainerDied","Data":"614671a58315301b18e664e8bcf25c87641cfc4870f1c901952bb1f3038e45da"} Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.279450 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hrk4p" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.279520 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="614671a58315301b18e664e8bcf25c87641cfc4870f1c901952bb1f3038e45da" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.467606 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 20:22:57 crc kubenswrapper[4807]: E1202 20:22:57.468345 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d53985-4c04-4e9c-be07-b866ac014640" containerName="nova-cell0-conductor-db-sync" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.468372 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d53985-4c04-4e9c-be07-b866ac014640" containerName="nova-cell0-conductor-db-sync" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.468742 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d53985-4c04-4e9c-be07-b866ac014640" containerName="nova-cell0-conductor-db-sync" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.469797 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.476258 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.477140 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-68rp7" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.481280 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.557040 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w7s6\" (UniqueName: \"kubernetes.io/projected/a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a-kube-api-access-9w7s6\") pod \"nova-cell0-conductor-0\" (UID: \"a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a\") " pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.557127 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a\") " pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.557213 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a\") " pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.659155 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w7s6\" (UniqueName: 
\"kubernetes.io/projected/a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a-kube-api-access-9w7s6\") pod \"nova-cell0-conductor-0\" (UID: \"a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a\") " pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.659263 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a\") " pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.659365 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a\") " pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.665453 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a\") " pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.668217 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a\") " pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.692825 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w7s6\" (UniqueName: \"kubernetes.io/projected/a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a-kube-api-access-9w7s6\") pod \"nova-cell0-conductor-0\" (UID: 
\"a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a\") " pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:57 crc kubenswrapper[4807]: I1202 20:22:57.795041 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.289877 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-52vl4" podUID="dd2fe568-7055-4982-8449-d1d90258a2dc" containerName="registry-server" containerID="cri-o://e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d" gracePeriod=2 Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.313316 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.779522 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.893203 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-catalog-content\") pod \"dd2fe568-7055-4982-8449-d1d90258a2dc\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.893285 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7hqd\" (UniqueName: \"kubernetes.io/projected/dd2fe568-7055-4982-8449-d1d90258a2dc-kube-api-access-g7hqd\") pod \"dd2fe568-7055-4982-8449-d1d90258a2dc\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.894013 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-utilities\") pod 
\"dd2fe568-7055-4982-8449-d1d90258a2dc\" (UID: \"dd2fe568-7055-4982-8449-d1d90258a2dc\") " Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.894944 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-utilities" (OuterVolumeSpecName: "utilities") pod "dd2fe568-7055-4982-8449-d1d90258a2dc" (UID: "dd2fe568-7055-4982-8449-d1d90258a2dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.895339 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.903049 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2fe568-7055-4982-8449-d1d90258a2dc-kube-api-access-g7hqd" (OuterVolumeSpecName: "kube-api-access-g7hqd") pod "dd2fe568-7055-4982-8449-d1d90258a2dc" (UID: "dd2fe568-7055-4982-8449-d1d90258a2dc"). InnerVolumeSpecName "kube-api-access-g7hqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.941207 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd2fe568-7055-4982-8449-d1d90258a2dc" (UID: "dd2fe568-7055-4982-8449-d1d90258a2dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.997689 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd2fe568-7055-4982-8449-d1d90258a2dc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:58 crc kubenswrapper[4807]: I1202 20:22:58.997752 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7hqd\" (UniqueName: \"kubernetes.io/projected/dd2fe568-7055-4982-8449-d1d90258a2dc-kube-api-access-g7hqd\") on node \"crc\" DevicePath \"\"" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.306397 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a","Type":"ContainerStarted","Data":"ef0b68d6abe3daec53a2d9e45064dcd223cd71707332499476270e3372a9c3ae"} Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.306486 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a","Type":"ContainerStarted","Data":"2f53d2aa10b350e466758c515d4759bc2dcc792bce76c7ad560f4938310c1316"} Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.306618 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.310956 4807 generic.go:334] "Generic (PLEG): container finished" podID="dd2fe568-7055-4982-8449-d1d90258a2dc" containerID="e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d" exitCode=0 Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.311043 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52vl4" event={"ID":"dd2fe568-7055-4982-8449-d1d90258a2dc","Type":"ContainerDied","Data":"e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d"} Dec 02 20:22:59 crc 
kubenswrapper[4807]: I1202 20:22:59.311456 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52vl4" event={"ID":"dd2fe568-7055-4982-8449-d1d90258a2dc","Type":"ContainerDied","Data":"c02c4291be97a551d83059dffe4b5a6b83075df4b456841ac8e7052a8978169e"} Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.311062 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52vl4" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.311502 4807 scope.go:117] "RemoveContainer" containerID="e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.347218 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.347188311 podStartE2EDuration="2.347188311s" podCreationTimestamp="2025-12-02 20:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:22:59.332641548 +0000 UTC m=+1514.633549083" watchObservedRunningTime="2025-12-02 20:22:59.347188311 +0000 UTC m=+1514.648095806" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.373223 4807 scope.go:117] "RemoveContainer" containerID="af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.380624 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52vl4"] Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.393454 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-52vl4"] Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.414545 4807 scope.go:117] "RemoveContainer" containerID="a0bbe961e0887f4dfeb26a54208fe98d8107ebfe904762d4ff3197a0a97fb5c7" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 
20:22:59.484985 4807 scope.go:117] "RemoveContainer" containerID="e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d" Dec 02 20:22:59 crc kubenswrapper[4807]: E1202 20:22:59.489315 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d\": container with ID starting with e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d not found: ID does not exist" containerID="e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.489388 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d"} err="failed to get container status \"e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d\": rpc error: code = NotFound desc = could not find container \"e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d\": container with ID starting with e858f81927383dfeb6ecb095ef315298f2604893c757283d2b663b4ca5b05c1d not found: ID does not exist" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.489433 4807 scope.go:117] "RemoveContainer" containerID="af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673" Dec 02 20:22:59 crc kubenswrapper[4807]: E1202 20:22:59.490102 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673\": container with ID starting with af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673 not found: ID does not exist" containerID="af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.490134 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673"} err="failed to get container status \"af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673\": rpc error: code = NotFound desc = could not find container \"af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673\": container with ID starting with af189bef77dde7ef192354ea021fd613121fee8123d5eaa51fc5a3067d15e673 not found: ID does not exist" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.490152 4807 scope.go:117] "RemoveContainer" containerID="a0bbe961e0887f4dfeb26a54208fe98d8107ebfe904762d4ff3197a0a97fb5c7" Dec 02 20:22:59 crc kubenswrapper[4807]: E1202 20:22:59.490428 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0bbe961e0887f4dfeb26a54208fe98d8107ebfe904762d4ff3197a0a97fb5c7\": container with ID starting with a0bbe961e0887f4dfeb26a54208fe98d8107ebfe904762d4ff3197a0a97fb5c7 not found: ID does not exist" containerID="a0bbe961e0887f4dfeb26a54208fe98d8107ebfe904762d4ff3197a0a97fb5c7" Dec 02 20:22:59 crc kubenswrapper[4807]: I1202 20:22:59.490461 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0bbe961e0887f4dfeb26a54208fe98d8107ebfe904762d4ff3197a0a97fb5c7"} err="failed to get container status \"a0bbe961e0887f4dfeb26a54208fe98d8107ebfe904762d4ff3197a0a97fb5c7\": rpc error: code = NotFound desc = could not find container \"a0bbe961e0887f4dfeb26a54208fe98d8107ebfe904762d4ff3197a0a97fb5c7\": container with ID starting with a0bbe961e0887f4dfeb26a54208fe98d8107ebfe904762d4ff3197a0a97fb5c7 not found: ID does not exist" Dec 02 20:23:00 crc kubenswrapper[4807]: I1202 20:23:00.593058 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 20:23:00 crc kubenswrapper[4807]: I1202 20:23:00.989838 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dd2fe568-7055-4982-8449-d1d90258a2dc" path="/var/lib/kubelet/pods/dd2fe568-7055-4982-8449-d1d90258a2dc/volumes" Dec 02 20:23:04 crc kubenswrapper[4807]: I1202 20:23:04.855106 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 20:23:04 crc kubenswrapper[4807]: I1202 20:23:04.857466 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="7e167e24-edba-4ff8-8b31-2c7141238bde" containerName="kube-state-metrics" containerID="cri-o://1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7" gracePeriod=30 Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.363151 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.408748 4807 generic.go:334] "Generic (PLEG): container finished" podID="7e167e24-edba-4ff8-8b31-2c7141238bde" containerID="1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7" exitCode=2 Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.408833 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7e167e24-edba-4ff8-8b31-2c7141238bde","Type":"ContainerDied","Data":"1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7"} Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.408874 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7e167e24-edba-4ff8-8b31-2c7141238bde","Type":"ContainerDied","Data":"aaeff63e620a323a30c4c1823703dee029930d029611990d40d66642df0ebbf2"} Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.408905 4807 scope.go:117] "RemoveContainer" containerID="1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7" Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.408889 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.442088 4807 scope.go:117] "RemoveContainer" containerID="1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7"
Dec 02 20:23:05 crc kubenswrapper[4807]: E1202 20:23:05.442769 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7\": container with ID starting with 1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7 not found: ID does not exist" containerID="1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.442810 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7"} err="failed to get container status \"1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7\": rpc error: code = NotFound desc = could not find container \"1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7\": container with ID starting with 1f01d08f38878ef4f03f695bea0bf8f233d60d8b3243cc825254999bc057a8d7 not found: ID does not exist"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.497241 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9bkw\" (UniqueName: \"kubernetes.io/projected/7e167e24-edba-4ff8-8b31-2c7141238bde-kube-api-access-d9bkw\") pod \"7e167e24-edba-4ff8-8b31-2c7141238bde\" (UID: \"7e167e24-edba-4ff8-8b31-2c7141238bde\") "
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.504229 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e167e24-edba-4ff8-8b31-2c7141238bde-kube-api-access-d9bkw" (OuterVolumeSpecName: "kube-api-access-d9bkw") pod "7e167e24-edba-4ff8-8b31-2c7141238bde" (UID: "7e167e24-edba-4ff8-8b31-2c7141238bde"). InnerVolumeSpecName "kube-api-access-d9bkw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.600496 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9bkw\" (UniqueName: \"kubernetes.io/projected/7e167e24-edba-4ff8-8b31-2c7141238bde-kube-api-access-d9bkw\") on node \"crc\" DevicePath \"\""
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.753623 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.762836 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.785352 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 20:23:05 crc kubenswrapper[4807]: E1202 20:23:05.786510 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2fe568-7055-4982-8449-d1d90258a2dc" containerName="extract-utilities"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.786666 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2fe568-7055-4982-8449-d1d90258a2dc" containerName="extract-utilities"
Dec 02 20:23:05 crc kubenswrapper[4807]: E1202 20:23:05.786785 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2fe568-7055-4982-8449-d1d90258a2dc" containerName="registry-server"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.786868 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2fe568-7055-4982-8449-d1d90258a2dc" containerName="registry-server"
Dec 02 20:23:05 crc kubenswrapper[4807]: E1202 20:23:05.786971 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e167e24-edba-4ff8-8b31-2c7141238bde" containerName="kube-state-metrics"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.787048 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e167e24-edba-4ff8-8b31-2c7141238bde" containerName="kube-state-metrics"
Dec 02 20:23:05 crc kubenswrapper[4807]: E1202 20:23:05.787141 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2fe568-7055-4982-8449-d1d90258a2dc" containerName="extract-content"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.787207 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2fe568-7055-4982-8449-d1d90258a2dc" containerName="extract-content"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.787556 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2fe568-7055-4982-8449-d1d90258a2dc" containerName="registry-server"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.787657 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e167e24-edba-4ff8-8b31-2c7141238bde" containerName="kube-state-metrics"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.789022 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.791789 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.792026 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.802457 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.906358 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ghfg\" (UniqueName: \"kubernetes.io/projected/a603da43-4e3b-4e75-8c4e-9e90908e2af4-kube-api-access-8ghfg\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.907804 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a603da43-4e3b-4e75-8c4e-9e90908e2af4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.907925 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a603da43-4e3b-4e75-8c4e-9e90908e2af4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:05 crc kubenswrapper[4807]: I1202 20:23:05.907961 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a603da43-4e3b-4e75-8c4e-9e90908e2af4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:06 crc kubenswrapper[4807]: I1202 20:23:06.010444 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ghfg\" (UniqueName: \"kubernetes.io/projected/a603da43-4e3b-4e75-8c4e-9e90908e2af4-kube-api-access-8ghfg\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:06 crc kubenswrapper[4807]: I1202 20:23:06.010814 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a603da43-4e3b-4e75-8c4e-9e90908e2af4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:06 crc kubenswrapper[4807]: I1202 20:23:06.011002 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a603da43-4e3b-4e75-8c4e-9e90908e2af4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:06 crc kubenswrapper[4807]: I1202 20:23:06.011157 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a603da43-4e3b-4e75-8c4e-9e90908e2af4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:06 crc kubenswrapper[4807]: I1202 20:23:06.017010 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a603da43-4e3b-4e75-8c4e-9e90908e2af4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:06 crc kubenswrapper[4807]: I1202 20:23:06.018990 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a603da43-4e3b-4e75-8c4e-9e90908e2af4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:06 crc kubenswrapper[4807]: I1202 20:23:06.025204 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a603da43-4e3b-4e75-8c4e-9e90908e2af4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:06 crc kubenswrapper[4807]: I1202 20:23:06.035736 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ghfg\" (UniqueName: \"kubernetes.io/projected/a603da43-4e3b-4e75-8c4e-9e90908e2af4-kube-api-access-8ghfg\") pod \"kube-state-metrics-0\" (UID: \"a603da43-4e3b-4e75-8c4e-9e90908e2af4\") " pod="openstack/kube-state-metrics-0"
Dec 02 20:23:06 crc kubenswrapper[4807]: I1202 20:23:06.125586 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 02 20:23:06 crc kubenswrapper[4807]: I1202 20:23:06.726769 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 02 20:23:06 crc kubenswrapper[4807]: I1202 20:23:06.986915 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e167e24-edba-4ff8-8b31-2c7141238bde" path="/var/lib/kubelet/pods/7e167e24-edba-4ff8-8b31-2c7141238bde/volumes"
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.138626 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.139264 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="ceilometer-central-agent" containerID="cri-o://e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c" gracePeriod=30
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.139447 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="ceilometer-notification-agent" containerID="cri-o://5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1" gracePeriod=30
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.139469 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="sg-core" containerID="cri-o://a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb" gracePeriod=30
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.139695 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="proxy-httpd" containerID="cri-o://209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037" gracePeriod=30
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.450341 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a603da43-4e3b-4e75-8c4e-9e90908e2af4","Type":"ContainerStarted","Data":"0f97e1b66052c251b2171d76cc350393e9174e438877c8dc066044e5639a7742"}
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.451034 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.451128 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a603da43-4e3b-4e75-8c4e-9e90908e2af4","Type":"ContainerStarted","Data":"ac8aa21f5eb12e977610b02168b75025bc7940f783870e9080b85a928eb7ad05"}
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.461201 4807 generic.go:334] "Generic (PLEG): container finished" podID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerID="209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037" exitCode=0
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.461243 4807 generic.go:334] "Generic (PLEG): container finished" podID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerID="a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb" exitCode=2
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.461276 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b","Type":"ContainerDied","Data":"209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037"}
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.461310 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b","Type":"ContainerDied","Data":"a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb"}
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.490093 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.144136605 podStartE2EDuration="2.490059521s" podCreationTimestamp="2025-12-02 20:23:05 +0000 UTC" firstStartedPulling="2025-12-02 20:23:06.722477914 +0000 UTC m=+1522.023385409" lastFinishedPulling="2025-12-02 20:23:07.06840083 +0000 UTC m=+1522.369308325" observedRunningTime="2025-12-02 20:23:07.472948211 +0000 UTC m=+1522.773855726" watchObservedRunningTime="2025-12-02 20:23:07.490059521 +0000 UTC m=+1522.790967026"
Dec 02 20:23:07 crc kubenswrapper[4807]: I1202 20:23:07.834353 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.529759 4807 generic.go:334] "Generic (PLEG): container finished" podID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerID="e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c" exitCode=0
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.530543 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b","Type":"ContainerDied","Data":"e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c"}
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.530618 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9z4k8"]
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.532763 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.538126 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.538406 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.550006 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9z4k8"]
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.681011 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-config-data\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.681496 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xc45\" (UniqueName: \"kubernetes.io/projected/18c173d1-798c-4f98-bfe3-0251b7a19403-kube-api-access-5xc45\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.682274 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.682560 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-scripts\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.702534 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.704761 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.711340 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.729915 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.785408 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xc45\" (UniqueName: \"kubernetes.io/projected/18c173d1-798c-4f98-bfe3-0251b7a19403-kube-api-access-5xc45\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.785979 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.786077 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.786208 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7r2w\" (UniqueName: \"kubernetes.io/projected/e7142a8b-a137-4559-a955-9544266902a6-kube-api-access-w7r2w\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.786355 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7142a8b-a137-4559-a955-9544266902a6-logs\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.786453 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-scripts\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.786535 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-config-data\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.786651 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-config-data\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.802447 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.802689 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-scripts\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.825125 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.827434 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.830549 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-config-data\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.844334 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.848354 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.857187 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xc45\" (UniqueName: \"kubernetes.io/projected/18c173d1-798c-4f98-bfe3-0251b7a19403-kube-api-access-5xc45\") pod \"nova-cell0-cell-mapping-9z4k8\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.857639 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.858112 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.866142 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9z4k8"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.891925 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.891995 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.892050 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad4e44c-eb67-43d9-b503-48692a4688e0-logs\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.892082 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7r2w\" (UniqueName: \"kubernetes.io/projected/e7142a8b-a137-4559-a955-9544266902a6-kube-api-access-w7r2w\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.892113 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-config-data\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.892145 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c274t\" (UniqueName: \"kubernetes.io/projected/0ad4e44c-eb67-43d9-b503-48692a4688e0-kube-api-access-c274t\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.892186 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7142a8b-a137-4559-a955-9544266902a6-logs\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.892217 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-config-data\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.897655 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7142a8b-a137-4559-a955-9544266902a6-logs\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.906253 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.915413 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-config-data\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.920936 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 20:23:08 crc kubenswrapper[4807]: I1202 20:23:08.928666 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7r2w\" (UniqueName: \"kubernetes.io/projected/e7142a8b-a137-4559-a955-9544266902a6-kube-api-access-w7r2w\") pod \"nova-api-0\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " pod="openstack/nova-api-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.017571 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.044009 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.141271 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.141545 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad4e44c-eb67-43d9-b503-48692a4688e0-logs\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.141661 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.141805 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-config-data\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.141894 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.141949 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c274t\" (UniqueName: \"kubernetes.io/projected/0ad4e44c-eb67-43d9-b503-48692a4688e0-kube-api-access-c274t\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.141996 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v4th\" (UniqueName: \"kubernetes.io/projected/b9a435bd-d058-4ef5-a457-e23bcccfb168-kube-api-access-4v4th\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.143509 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad4e44c-eb67-43d9-b503-48692a4688e0-logs\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.181121 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.195546 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-config-data\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.197788 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c274t\" (UniqueName: \"kubernetes.io/projected/0ad4e44c-eb67-43d9-b503-48692a4688e0-kube-api-access-c274t\") pod \"nova-metadata-0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " pod="openstack/nova-metadata-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.220847 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.223076 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.228898 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.246975 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.247078 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.247108 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v4th\" (UniqueName: \"kubernetes.io/projected/b9a435bd-d058-4ef5-a457-e23bcccfb168-kube-api-access-4v4th\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.251551 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.260684 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.273453 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.287824 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-c7m8h"]
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.289938 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.298432 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v4th\" (UniqueName: \"kubernetes.io/projected/b9a435bd-d058-4ef5-a457-e23bcccfb168-kube-api-access-4v4th\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.306824 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-c7m8h"]
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.334293 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.358482 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.360500 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " pod="openstack/nova-scheduler-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.360623 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.360654 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54mn8\" (UniqueName: \"kubernetes.io/projected/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-kube-api-access-54mn8\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.360705 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-svc\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.360805 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.361049 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-config\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.361279 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.361325 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-config-data\") pod \"nova-scheduler-0\" (UID: \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " pod="openstack/nova-scheduler-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.361358 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwcb4\" (UniqueName: \"kubernetes.io/projected/34c5fea6-3631-42fa-9e77-7b4f1e714aec-kube-api-access-pwcb4\") pod \"nova-scheduler-0\" (UID: \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " pod="openstack/nova-scheduler-0"
Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.466037 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID:
\"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.466112 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.466133 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54mn8\" (UniqueName: \"kubernetes.io/projected/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-kube-api-access-54mn8\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.466165 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-svc\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.466800 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.466939 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-config\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 
02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.467042 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.467072 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-config-data\") pod \"nova-scheduler-0\" (UID: \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.467101 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwcb4\" (UniqueName: \"kubernetes.io/projected/34c5fea6-3631-42fa-9e77-7b4f1e714aec-kube-api-access-pwcb4\") pod \"nova-scheduler-0\" (UID: \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.467316 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-svc\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.468944 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-config\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.472891 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.482609 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.485843 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.496034 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-config-data\") pod \"nova-scheduler-0\" (UID: \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.510797 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.512008 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54mn8\" (UniqueName: \"kubernetes.io/projected/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-kube-api-access-54mn8\") pod 
\"dnsmasq-dns-bccf8f775-c7m8h\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.532827 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwcb4\" (UniqueName: \"kubernetes.io/projected/34c5fea6-3631-42fa-9e77-7b4f1e714aec-kube-api-access-pwcb4\") pod \"nova-scheduler-0\" (UID: \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.533837 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:09 crc kubenswrapper[4807]: I1202 20:23:09.718495 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.063956 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.330567 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-j86hv"] Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.332479 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.348927 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.350051 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.356657 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9z4k8"] Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.385649 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-j86hv"] Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.414943 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:10 crc kubenswrapper[4807]: W1202 20:23:10.425618 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9a435bd_d058_4ef5_a457_e23bcccfb168.slice/crio-0ea22dd8358dd5cb482dcdde7d2d072844fbbbab9f0fc468f63ba11c9469c2b9 WatchSource:0}: Error finding container 0ea22dd8358dd5cb482dcdde7d2d072844fbbbab9f0fc468f63ba11c9469c2b9: Status 404 returned error can't find the container with id 0ea22dd8358dd5cb482dcdde7d2d072844fbbbab9f0fc468f63ba11c9469c2b9 Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.449041 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.521909 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " 
pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.522111 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2plrk\" (UniqueName: \"kubernetes.io/projected/42be5b49-eca1-4532-8742-3dfbb8a4f910-kube-api-access-2plrk\") pod \"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.522165 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-scripts\") pod \"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.522194 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-config-data\") pod \"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.557267 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-c7m8h"] Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.567803 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 20:23:10 crc kubenswrapper[4807]: W1202 20:23:10.577486 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2148eff_b1b3_45a5_9e9e_0769521c4cb7.slice/crio-b9444278b2ff43dfc57fb14ec5fa09d67108b967bceb1b4734a0cc321c807997 WatchSource:0}: Error finding container 
b9444278b2ff43dfc57fb14ec5fa09d67108b967bceb1b4734a0cc321c807997: Status 404 returned error can't find the container with id b9444278b2ff43dfc57fb14ec5fa09d67108b967bceb1b4734a0cc321c807997 Dec 02 20:23:10 crc kubenswrapper[4807]: W1202 20:23:10.596741 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34c5fea6_3631_42fa_9e77_7b4f1e714aec.slice/crio-5118cdd885e702f844eae4352be8faeb668a9527da1e03e93b003f0d192cf932 WatchSource:0}: Error finding container 5118cdd885e702f844eae4352be8faeb668a9527da1e03e93b003f0d192cf932: Status 404 returned error can't find the container with id 5118cdd885e702f844eae4352be8faeb668a9527da1e03e93b003f0d192cf932 Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.612195 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9z4k8" event={"ID":"18c173d1-798c-4f98-bfe3-0251b7a19403","Type":"ContainerStarted","Data":"96fdf9c938ed1d88a87bce9e40f5265a73c6bea2e554b259cce94eb0e393d554"} Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.616457 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ad4e44c-eb67-43d9-b503-48692a4688e0","Type":"ContainerStarted","Data":"f05a31f263154b91e424a079375d9dc29ac640b4d44289535a77c022b94ec88c"} Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.624359 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-scripts\") pod \"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.624432 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-config-data\") pod 
\"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.624505 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.624514 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" event={"ID":"d2148eff-b1b3-45a5-9e9e-0769521c4cb7","Type":"ContainerStarted","Data":"b9444278b2ff43dfc57fb14ec5fa09d67108b967bceb1b4734a0cc321c807997"} Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.624626 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2plrk\" (UniqueName: \"kubernetes.io/projected/42be5b49-eca1-4532-8742-3dfbb8a4f910-kube-api-access-2plrk\") pod \"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.630178 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9a435bd-d058-4ef5-a457-e23bcccfb168","Type":"ContainerStarted","Data":"0ea22dd8358dd5cb482dcdde7d2d072844fbbbab9f0fc468f63ba11c9469c2b9"} Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.630892 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 
crc kubenswrapper[4807]: I1202 20:23:10.631255 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-scripts\") pod \"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.632458 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-config-data\") pod \"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.638001 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7142a8b-a137-4559-a955-9544266902a6","Type":"ContainerStarted","Data":"9728e41a1d04687e855d26a682698f007bdaed95c75ccc4697ea20bbdca2838d"} Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.650187 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2plrk\" (UniqueName: \"kubernetes.io/projected/42be5b49-eca1-4532-8742-3dfbb8a4f910-kube-api-access-2plrk\") pod \"nova-cell1-conductor-db-sync-j86hv\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:10 crc kubenswrapper[4807]: I1202 20:23:10.672134 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.204967 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-j86hv"] Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.466316 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.566856 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-sg-core-conf-yaml\") pod \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.566942 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-run-httpd\") pod \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.566988 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-log-httpd\") pod \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.567066 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-scripts\") pod \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.567245 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-combined-ca-bundle\") pod \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.567286 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt8ph\" (UniqueName: 
\"kubernetes.io/projected/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-kube-api-access-lt8ph\") pod \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.567420 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-config-data\") pod \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\" (UID: \"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b\") " Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.569945 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" (UID: "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.570303 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" (UID: "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.581033 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-kube-api-access-lt8ph" (OuterVolumeSpecName: "kube-api-access-lt8ph") pod "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" (UID: "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b"). InnerVolumeSpecName "kube-api-access-lt8ph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.581497 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-scripts" (OuterVolumeSpecName: "scripts") pod "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" (UID: "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.611240 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" (UID: "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.664634 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9z4k8" event={"ID":"18c173d1-798c-4f98-bfe3-0251b7a19403","Type":"ContainerStarted","Data":"33ecbbd48426c71757455798136c098c550adb765a2a431477989d38e70ad868"} Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.670560 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt8ph\" (UniqueName: \"kubernetes.io/projected/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-kube-api-access-lt8ph\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.670605 4807 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.670617 4807 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.670629 4807 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.670641 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.672126 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-j86hv" event={"ID":"42be5b49-eca1-4532-8742-3dfbb8a4f910","Type":"ContainerStarted","Data":"235a7f98a3f147776d96a37472ab1760ab0aef5b3abe194ff9122d3ea29c9cf8"} Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.683139 4807 generic.go:334] "Generic (PLEG): container finished" podID="d2148eff-b1b3-45a5-9e9e-0769521c4cb7" containerID="b3cfbd259912c35fa2910204c7d82f4bf04097a96271e35732a88eb864ca2c48" exitCode=0 Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.683249 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" event={"ID":"d2148eff-b1b3-45a5-9e9e-0769521c4cb7","Type":"ContainerDied","Data":"b3cfbd259912c35fa2910204c7d82f4bf04097a96271e35732a88eb864ca2c48"} Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.699260 4807 generic.go:334] "Generic (PLEG): container finished" podID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerID="5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1" exitCode=0 Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.699368 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b","Type":"ContainerDied","Data":"5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1"} Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.699413 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b","Type":"ContainerDied","Data":"496d0ace14e4c03b8068781caf15acf4a6be124847d93cfe74a62ea9542361fa"} Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.699440 4807 scope.go:117] "RemoveContainer" containerID="209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.699674 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.702863 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9z4k8" podStartSLOduration=3.702832012 podStartE2EDuration="3.702832012s" podCreationTimestamp="2025-12-02 20:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:23:11.689780843 +0000 UTC m=+1526.990688358" watchObservedRunningTime="2025-12-02 20:23:11.702832012 +0000 UTC m=+1527.003739507" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.728806 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"34c5fea6-3631-42fa-9e77-7b4f1e714aec","Type":"ContainerStarted","Data":"5118cdd885e702f844eae4352be8faeb668a9527da1e03e93b003f0d192cf932"} Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.766418 4807 scope.go:117] "RemoveContainer" containerID="a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.772466 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-config-data" (OuterVolumeSpecName: "config-data") pod "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" (UID: "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.773697 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.779626 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" (UID: "ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.811363 4807 scope.go:117] "RemoveContainer" containerID="5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.877453 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:11 crc kubenswrapper[4807]: I1202 20:23:11.982332 4807 scope.go:117] "RemoveContainer" containerID="e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.061055 4807 scope.go:117] "RemoveContainer" containerID="209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.066794 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:23:12 crc kubenswrapper[4807]: E1202 
20:23:12.067698 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037\": container with ID starting with 209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037 not found: ID does not exist" containerID="209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.067776 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037"} err="failed to get container status \"209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037\": rpc error: code = NotFound desc = could not find container \"209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037\": container with ID starting with 209e9c29c6bc36bfec8cc3c4d028dde495ae8fb0a350449a9badd2b82a67d037 not found: ID does not exist" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.067814 4807 scope.go:117] "RemoveContainer" containerID="a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb" Dec 02 20:23:12 crc kubenswrapper[4807]: E1202 20:23:12.069157 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb\": container with ID starting with a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb not found: ID does not exist" containerID="a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.069206 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb"} err="failed to get container status \"a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb\": rpc 
error: code = NotFound desc = could not find container \"a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb\": container with ID starting with a5993e5be6b251561dfb29bbd31d235e9f97aa5e214cf33897dd70450d334ceb not found: ID does not exist" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.069245 4807 scope.go:117] "RemoveContainer" containerID="5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1" Dec 02 20:23:12 crc kubenswrapper[4807]: E1202 20:23:12.070327 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1\": container with ID starting with 5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1 not found: ID does not exist" containerID="5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.070368 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1"} err="failed to get container status \"5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1\": rpc error: code = NotFound desc = could not find container \"5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1\": container with ID starting with 5d71ed41c67e0ed0ac561b6a856bf866a02ddad283f9c004ba2b6c5703768ec1 not found: ID does not exist" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.070395 4807 scope.go:117] "RemoveContainer" containerID="e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c" Dec 02 20:23:12 crc kubenswrapper[4807]: E1202 20:23:12.077549 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c\": container with ID starting with 
e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c not found: ID does not exist" containerID="e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.077605 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c"} err="failed to get container status \"e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c\": rpc error: code = NotFound desc = could not find container \"e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c\": container with ID starting with e9c8f133b651da3a5e86290625a32654018cfbf1e01d9cd3c69dad1b0a5b7f8c not found: ID does not exist" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.087938 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.105654 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:23:12 crc kubenswrapper[4807]: E1202 20:23:12.106241 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="sg-core" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.106263 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="sg-core" Dec 02 20:23:12 crc kubenswrapper[4807]: E1202 20:23:12.106293 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="proxy-httpd" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.106299 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="proxy-httpd" Dec 02 20:23:12 crc kubenswrapper[4807]: E1202 20:23:12.106318 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="ceilometer-notification-agent" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.106326 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="ceilometer-notification-agent" Dec 02 20:23:12 crc kubenswrapper[4807]: E1202 20:23:12.106353 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="ceilometer-central-agent" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.106359 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="ceilometer-central-agent" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.106576 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="sg-core" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.106596 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="ceilometer-notification-agent" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.106607 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="ceilometer-central-agent" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.106619 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" containerName="proxy-httpd" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.109671 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.114917 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.115261 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.115422 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.139949 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.290741 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.291089 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.291132 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzdfc\" (UniqueName: \"kubernetes.io/projected/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-kube-api-access-hzdfc\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.291155 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.291184 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-config-data\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.291749 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-scripts\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.291920 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-run-httpd\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.292052 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-log-httpd\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.394124 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-scripts\") pod \"ceilometer-0\" (UID: 
\"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.394679 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-run-httpd\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.394737 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-log-httpd\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.394812 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.394846 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.394874 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzdfc\" (UniqueName: \"kubernetes.io/projected/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-kube-api-access-hzdfc\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.394904 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.394926 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-config-data\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.395924 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-run-httpd\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.396245 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-log-httpd\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.402873 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.405301 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-scripts\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 
20:23:12.410418 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.416919 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzdfc\" (UniqueName: \"kubernetes.io/projected/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-kube-api-access-hzdfc\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.418863 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-config-data\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.442413 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.738646 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.751473 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-j86hv" event={"ID":"42be5b49-eca1-4532-8742-3dfbb8a4f910","Type":"ContainerStarted","Data":"39bd67727ba39f200c809ece4e6154872c2dacdc584517234a8e1e51448f65a6"} Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.756288 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" event={"ID":"d2148eff-b1b3-45a5-9e9e-0769521c4cb7","Type":"ContainerStarted","Data":"7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d"} Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.756450 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.785794 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-j86hv" podStartSLOduration=2.785638608 podStartE2EDuration="2.785638608s" podCreationTimestamp="2025-12-02 20:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:23:12.772120296 +0000 UTC m=+1528.073027791" watchObservedRunningTime="2025-12-02 20:23:12.785638608 +0000 UTC m=+1528.086546103" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.799532 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" podStartSLOduration=4.799497371 podStartE2EDuration="4.799497371s" podCreationTimestamp="2025-12-02 20:23:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:23:12.795895344 +0000 UTC m=+1528.096802839" watchObservedRunningTime="2025-12-02 20:23:12.799497371 +0000 UTC 
m=+1528.100404866" Dec 02 20:23:12 crc kubenswrapper[4807]: I1202 20:23:12.992495 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b" path="/var/lib/kubelet/pods/ef94bcb2-ba0a-4903-9647-c3c8fa1fcd0b/volumes" Dec 02 20:23:13 crc kubenswrapper[4807]: I1202 20:23:13.161976 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 20:23:13 crc kubenswrapper[4807]: I1202 20:23:13.174259 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:16 crc kubenswrapper[4807]: I1202 20:23:16.144806 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 20:23:16 crc kubenswrapper[4807]: W1202 20:23:16.433106 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e9a71e7_d4c0_48ce_94bd_e5563864adfa.slice/crio-a0cc5110af2ae31cc80f2eed787494835c038a0bce8a301e38c042513c175492 WatchSource:0}: Error finding container a0cc5110af2ae31cc80f2eed787494835c038a0bce8a301e38c042513c175492: Status 404 returned error can't find the container with id a0cc5110af2ae31cc80f2eed787494835c038a0bce8a301e38c042513c175492 Dec 02 20:23:16 crc kubenswrapper[4807]: I1202 20:23:16.441551 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:23:16 crc kubenswrapper[4807]: I1202 20:23:16.853146 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ad4e44c-eb67-43d9-b503-48692a4688e0","Type":"ContainerStarted","Data":"80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821"} Dec 02 20:23:16 crc kubenswrapper[4807]: I1202 20:23:16.855918 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6e9a71e7-d4c0-48ce-94bd-e5563864adfa","Type":"ContainerStarted","Data":"a0cc5110af2ae31cc80f2eed787494835c038a0bce8a301e38c042513c175492"} Dec 02 20:23:16 crc kubenswrapper[4807]: I1202 20:23:16.857272 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9a435bd-d058-4ef5-a457-e23bcccfb168","Type":"ContainerStarted","Data":"09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34"} Dec 02 20:23:16 crc kubenswrapper[4807]: I1202 20:23:16.857436 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b9a435bd-d058-4ef5-a457-e23bcccfb168" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34" gracePeriod=30 Dec 02 20:23:16 crc kubenswrapper[4807]: I1202 20:23:16.859834 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7142a8b-a137-4559-a955-9544266902a6","Type":"ContainerStarted","Data":"1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7"} Dec 02 20:23:16 crc kubenswrapper[4807]: I1202 20:23:16.861025 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"34c5fea6-3631-42fa-9e77-7b4f1e714aec","Type":"ContainerStarted","Data":"91f98c155631acf2d9df29ca729301e14f1510fe56bccae37100dede184cda1d"} Dec 02 20:23:16 crc kubenswrapper[4807]: I1202 20:23:16.904691 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.436950881 podStartE2EDuration="8.904662166s" podCreationTimestamp="2025-12-02 20:23:08 +0000 UTC" firstStartedPulling="2025-12-02 20:23:10.448393621 +0000 UTC m=+1525.749301116" lastFinishedPulling="2025-12-02 20:23:15.916104906 +0000 UTC m=+1531.217012401" observedRunningTime="2025-12-02 20:23:16.882545177 +0000 UTC m=+1532.183452682" 
watchObservedRunningTime="2025-12-02 20:23:16.904662166 +0000 UTC m=+1532.205569661" Dec 02 20:23:16 crc kubenswrapper[4807]: I1202 20:23:16.954250 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.642599287 podStartE2EDuration="8.954209722s" podCreationTimestamp="2025-12-02 20:23:08 +0000 UTC" firstStartedPulling="2025-12-02 20:23:10.602599475 +0000 UTC m=+1525.903506970" lastFinishedPulling="2025-12-02 20:23:15.91420991 +0000 UTC m=+1531.215117405" observedRunningTime="2025-12-02 20:23:16.929800155 +0000 UTC m=+1532.230707650" watchObservedRunningTime="2025-12-02 20:23:16.954209722 +0000 UTC m=+1532.255117217" Dec 02 20:23:17 crc kubenswrapper[4807]: I1202 20:23:17.878828 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0ad4e44c-eb67-43d9-b503-48692a4688e0" containerName="nova-metadata-log" containerID="cri-o://80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821" gracePeriod=30 Dec 02 20:23:17 crc kubenswrapper[4807]: I1202 20:23:17.878830 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ad4e44c-eb67-43d9-b503-48692a4688e0","Type":"ContainerStarted","Data":"3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c"} Dec 02 20:23:17 crc kubenswrapper[4807]: I1202 20:23:17.878947 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0ad4e44c-eb67-43d9-b503-48692a4688e0" containerName="nova-metadata-metadata" containerID="cri-o://3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c" gracePeriod=30 Dec 02 20:23:17 crc kubenswrapper[4807]: I1202 20:23:17.895794 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"e7142a8b-a137-4559-a955-9544266902a6","Type":"ContainerStarted","Data":"02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3"} Dec 02 20:23:17 crc kubenswrapper[4807]: I1202 20:23:17.922554 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.408202705 podStartE2EDuration="9.922518058s" podCreationTimestamp="2025-12-02 20:23:08 +0000 UTC" firstStartedPulling="2025-12-02 20:23:10.402661569 +0000 UTC m=+1525.703569064" lastFinishedPulling="2025-12-02 20:23:15.916976912 +0000 UTC m=+1531.217884417" observedRunningTime="2025-12-02 20:23:17.89976828 +0000 UTC m=+1533.200675775" watchObservedRunningTime="2025-12-02 20:23:17.922518058 +0000 UTC m=+1533.223425553" Dec 02 20:23:17 crc kubenswrapper[4807]: I1202 20:23:17.936007 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.093306724 podStartE2EDuration="9.935968129s" podCreationTimestamp="2025-12-02 20:23:08 +0000 UTC" firstStartedPulling="2025-12-02 20:23:10.071573435 +0000 UTC m=+1525.372480930" lastFinishedPulling="2025-12-02 20:23:15.91423484 +0000 UTC m=+1531.215142335" observedRunningTime="2025-12-02 20:23:17.925750344 +0000 UTC m=+1533.226657839" watchObservedRunningTime="2025-12-02 20:23:17.935968129 +0000 UTC m=+1533.236875634" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.449414 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.603918 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad4e44c-eb67-43d9-b503-48692a4688e0-logs\") pod \"0ad4e44c-eb67-43d9-b503-48692a4688e0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.604211 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-combined-ca-bundle\") pod \"0ad4e44c-eb67-43d9-b503-48692a4688e0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.604263 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-config-data\") pod \"0ad4e44c-eb67-43d9-b503-48692a4688e0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.604307 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c274t\" (UniqueName: \"kubernetes.io/projected/0ad4e44c-eb67-43d9-b503-48692a4688e0-kube-api-access-c274t\") pod \"0ad4e44c-eb67-43d9-b503-48692a4688e0\" (UID: \"0ad4e44c-eb67-43d9-b503-48692a4688e0\") " Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.605343 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ad4e44c-eb67-43d9-b503-48692a4688e0-logs" (OuterVolumeSpecName: "logs") pod "0ad4e44c-eb67-43d9-b503-48692a4688e0" (UID: "0ad4e44c-eb67-43d9-b503-48692a4688e0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.612644 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ad4e44c-eb67-43d9-b503-48692a4688e0-kube-api-access-c274t" (OuterVolumeSpecName: "kube-api-access-c274t") pod "0ad4e44c-eb67-43d9-b503-48692a4688e0" (UID: "0ad4e44c-eb67-43d9-b503-48692a4688e0"). InnerVolumeSpecName "kube-api-access-c274t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.638784 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-config-data" (OuterVolumeSpecName: "config-data") pod "0ad4e44c-eb67-43d9-b503-48692a4688e0" (UID: "0ad4e44c-eb67-43d9-b503-48692a4688e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.652515 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ad4e44c-eb67-43d9-b503-48692a4688e0" (UID: "0ad4e44c-eb67-43d9-b503-48692a4688e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.708276 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ad4e44c-eb67-43d9-b503-48692a4688e0-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.708322 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.708341 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ad4e44c-eb67-43d9-b503-48692a4688e0-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.708351 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c274t\" (UniqueName: \"kubernetes.io/projected/0ad4e44c-eb67-43d9-b503-48692a4688e0-kube-api-access-c274t\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.910826 4807 generic.go:334] "Generic (PLEG): container finished" podID="0ad4e44c-eb67-43d9-b503-48692a4688e0" containerID="3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c" exitCode=0 Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.910872 4807 generic.go:334] "Generic (PLEG): container finished" podID="0ad4e44c-eb67-43d9-b503-48692a4688e0" containerID="80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821" exitCode=143 Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.910925 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ad4e44c-eb67-43d9-b503-48692a4688e0","Type":"ContainerDied","Data":"3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c"} Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.910994 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ad4e44c-eb67-43d9-b503-48692a4688e0","Type":"ContainerDied","Data":"80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821"} Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.911014 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0ad4e44c-eb67-43d9-b503-48692a4688e0","Type":"ContainerDied","Data":"f05a31f263154b91e424a079375d9dc29ac640b4d44289535a77c022b94ec88c"} Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.911040 4807 scope.go:117] "RemoveContainer" containerID="3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.912754 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.914810 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e9a71e7-d4c0-48ce-94bd-e5563864adfa","Type":"ContainerStarted","Data":"c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4"} Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.940137 4807 scope.go:117] "RemoveContainer" containerID="80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.968224 4807 scope.go:117] "RemoveContainer" containerID="3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c" Dec 02 20:23:18 crc kubenswrapper[4807]: E1202 20:23:18.968815 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c\": container with ID starting with 3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c not found: ID does not exist" 
containerID="3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.968852 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c"} err="failed to get container status \"3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c\": rpc error: code = NotFound desc = could not find container \"3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c\": container with ID starting with 3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c not found: ID does not exist" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.968879 4807 scope.go:117] "RemoveContainer" containerID="80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821" Dec 02 20:23:18 crc kubenswrapper[4807]: E1202 20:23:18.969240 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821\": container with ID starting with 80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821 not found: ID does not exist" containerID="80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.969312 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821"} err="failed to get container status \"80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821\": rpc error: code = NotFound desc = could not find container \"80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821\": container with ID starting with 80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821 not found: ID does not exist" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.969358 4807 scope.go:117] 
"RemoveContainer" containerID="3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.969799 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c"} err="failed to get container status \"3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c\": rpc error: code = NotFound desc = could not find container \"3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c\": container with ID starting with 3481b4ec0539581b80dc900aadbbd8f623e5ceca40b38ffb49dae1b643f70b5c not found: ID does not exist" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.969822 4807 scope.go:117] "RemoveContainer" containerID="80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.970062 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821"} err="failed to get container status \"80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821\": rpc error: code = NotFound desc = could not find container \"80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821\": container with ID starting with 80b34e3a8cc307722f55b6b9a4988a57ce2d807a7528457aee5c0ea659e7a821 not found: ID does not exist" Dec 02 20:23:18 crc kubenswrapper[4807]: I1202 20:23:18.984102 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.003370 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.045959 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.046029 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.074650 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:19 crc kubenswrapper[4807]: E1202 20:23:19.076441 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad4e44c-eb67-43d9-b503-48692a4688e0" containerName="nova-metadata-metadata" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.076562 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad4e44c-eb67-43d9-b503-48692a4688e0" containerName="nova-metadata-metadata" Dec 02 20:23:19 crc kubenswrapper[4807]: E1202 20:23:19.076632 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ad4e44c-eb67-43d9-b503-48692a4688e0" containerName="nova-metadata-log" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.076643 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ad4e44c-eb67-43d9-b503-48692a4688e0" containerName="nova-metadata-log" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.076951 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ad4e44c-eb67-43d9-b503-48692a4688e0" containerName="nova-metadata-log" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.076983 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ad4e44c-eb67-43d9-b503-48692a4688e0" containerName="nova-metadata-metadata" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.083712 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.088211 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.088426 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.112228 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.225051 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk57j\" (UniqueName: \"kubernetes.io/projected/25d52d5e-fc8e-48bc-bb96-2a3933035103-kube-api-access-dk57j\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.225102 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.225127 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-config-data\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.225774 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.225908 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d52d5e-fc8e-48bc-bb96-2a3933035103-logs\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.327951 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d52d5e-fc8e-48bc-bb96-2a3933035103-logs\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.328110 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk57j\" (UniqueName: \"kubernetes.io/projected/25d52d5e-fc8e-48bc-bb96-2a3933035103-kube-api-access-dk57j\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.328148 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.328198 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-config-data\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" 
Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.328434 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.329281 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d52d5e-fc8e-48bc-bb96-2a3933035103-logs\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.337066 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-config-data\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.337096 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.338004 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.354402 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk57j\" (UniqueName: 
\"kubernetes.io/projected/25d52d5e-fc8e-48bc-bb96-2a3933035103-kube-api-access-dk57j\") pod \"nova-metadata-0\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.360178 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.537989 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.543395 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.655671 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fdw8d"] Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.656523 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" podUID="61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" containerName="dnsmasq-dns" containerID="cri-o://d16103684b5bbaddf734cc78a6c310b6bb8f9eb58d50f341e5a85c8f8040d15d" gracePeriod=10 Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.719936 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.720005 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.817111 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.981283 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6e9a71e7-d4c0-48ce-94bd-e5563864adfa","Type":"ContainerStarted","Data":"f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e"} Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.997138 4807 generic.go:334] "Generic (PLEG): container finished" podID="61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" containerID="d16103684b5bbaddf734cc78a6c310b6bb8f9eb58d50f341e5a85c8f8040d15d" exitCode=0 Dec 02 20:23:19 crc kubenswrapper[4807]: I1202 20:23:19.997863 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" event={"ID":"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72","Type":"ContainerDied","Data":"d16103684b5bbaddf734cc78a6c310b6bb8f9eb58d50f341e5a85c8f8040d15d"} Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.121199 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.164881 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e7142a8b-a137-4559-a955-9544266902a6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.165041 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e7142a8b-a137-4559-a955-9544266902a6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.230787 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.644306 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.715765 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-swift-storage-0\") pod \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.809114 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" (UID: "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.821459 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtpcf\" (UniqueName: \"kubernetes.io/projected/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-kube-api-access-jtpcf\") pod \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.821645 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-sb\") pod \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.821689 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-svc\") pod \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 
20:23:20.821758 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-config\") pod \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.821794 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-nb\") pod \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\" (UID: \"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72\") " Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.822166 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.877333 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-kube-api-access-jtpcf" (OuterVolumeSpecName: "kube-api-access-jtpcf") pod "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" (UID: "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72"). InnerVolumeSpecName "kube-api-access-jtpcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.942959 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" (UID: "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.957861 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.957908 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtpcf\" (UniqueName: \"kubernetes.io/projected/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-kube-api-access-jtpcf\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.962071 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" (UID: "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:23:20 crc kubenswrapper[4807]: I1202 20:23:20.992151 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" (UID: "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.036474 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-config" (OuterVolumeSpecName: "config") pod "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" (UID: "61a9f7ef-7bd0-49c9-8cd7-3eca252dca72"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.038088 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ad4e44c-eb67-43d9-b503-48692a4688e0" path="/var/lib/kubelet/pods/0ad4e44c-eb67-43d9-b503-48692a4688e0/volumes" Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.041581 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" event={"ID":"61a9f7ef-7bd0-49c9-8cd7-3eca252dca72","Type":"ContainerDied","Data":"bb3050f5926637bacffca6fc1d6709a0b4e3e23413d30ec6fe61e1511060a19c"} Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.041661 4807 scope.go:117] "RemoveContainer" containerID="d16103684b5bbaddf734cc78a6c310b6bb8f9eb58d50f341e5a85c8f8040d15d" Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.044502 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-fdw8d" Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.060225 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.060272 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.060289 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.071160 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"25d52d5e-fc8e-48bc-bb96-2a3933035103","Type":"ContainerStarted","Data":"20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c"} Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.071402 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"25d52d5e-fc8e-48bc-bb96-2a3933035103","Type":"ContainerStarted","Data":"19bde7c55e1429c240160b4198ad1eab325d437695bff48ff8ec8345dc31d38f"} Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.081386 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e9a71e7-d4c0-48ce-94bd-e5563864adfa","Type":"ContainerStarted","Data":"86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b"} Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.122583 4807 scope.go:117] "RemoveContainer" containerID="11d5f7249a4d1bbb33484216e73637e9898513f7314137f7d0e766e6e40fd147" Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.140480 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fdw8d"] Dec 02 20:23:21 crc kubenswrapper[4807]: I1202 20:23:21.148748 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-fdw8d"] Dec 02 20:23:22 crc kubenswrapper[4807]: I1202 20:23:22.094588 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"25d52d5e-fc8e-48bc-bb96-2a3933035103","Type":"ContainerStarted","Data":"624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce"} Dec 02 20:23:22 crc kubenswrapper[4807]: I1202 20:23:22.100626 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e9a71e7-d4c0-48ce-94bd-e5563864adfa","Type":"ContainerStarted","Data":"cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf"} Dec 02 20:23:22 crc kubenswrapper[4807]: I1202 20:23:22.102335 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Dec 02 20:23:22 crc kubenswrapper[4807]: I1202 20:23:22.126636 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.12660258 podStartE2EDuration="4.12660258s" podCreationTimestamp="2025-12-02 20:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:23:22.116796968 +0000 UTC m=+1537.417704473" watchObservedRunningTime="2025-12-02 20:23:22.12660258 +0000 UTC m=+1537.427510095" Dec 02 20:23:22 crc kubenswrapper[4807]: I1202 20:23:22.162007 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.887695501 podStartE2EDuration="10.161942373s" podCreationTimestamp="2025-12-02 20:23:12 +0000 UTC" firstStartedPulling="2025-12-02 20:23:16.436736846 +0000 UTC m=+1531.737644341" lastFinishedPulling="2025-12-02 20:23:21.710983718 +0000 UTC m=+1537.011891213" observedRunningTime="2025-12-02 20:23:22.146857063 +0000 UTC m=+1537.447764568" watchObservedRunningTime="2025-12-02 20:23:22.161942373 +0000 UTC m=+1537.462849878" Dec 02 20:23:22 crc kubenswrapper[4807]: I1202 20:23:22.990853 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" path="/var/lib/kubelet/pods/61a9f7ef-7bd0-49c9-8cd7-3eca252dca72/volumes" Dec 02 20:23:23 crc kubenswrapper[4807]: I1202 20:23:23.123493 4807 generic.go:334] "Generic (PLEG): container finished" podID="18c173d1-798c-4f98-bfe3-0251b7a19403" containerID="33ecbbd48426c71757455798136c098c550adb765a2a431477989d38e70ad868" exitCode=0 Dec 02 20:23:23 crc kubenswrapper[4807]: I1202 20:23:23.124817 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9z4k8" 
event={"ID":"18c173d1-798c-4f98-bfe3-0251b7a19403","Type":"ContainerDied","Data":"33ecbbd48426c71757455798136c098c550adb765a2a431477989d38e70ad868"} Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.544549 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.545497 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.581081 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9z4k8" Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.649441 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-combined-ca-bundle\") pod \"18c173d1-798c-4f98-bfe3-0251b7a19403\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.649657 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xc45\" (UniqueName: \"kubernetes.io/projected/18c173d1-798c-4f98-bfe3-0251b7a19403-kube-api-access-5xc45\") pod \"18c173d1-798c-4f98-bfe3-0251b7a19403\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.649758 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-scripts\") pod \"18c173d1-798c-4f98-bfe3-0251b7a19403\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.649815 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-config-data\") pod 
\"18c173d1-798c-4f98-bfe3-0251b7a19403\" (UID: \"18c173d1-798c-4f98-bfe3-0251b7a19403\") " Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.661908 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-scripts" (OuterVolumeSpecName: "scripts") pod "18c173d1-798c-4f98-bfe3-0251b7a19403" (UID: "18c173d1-798c-4f98-bfe3-0251b7a19403"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.661945 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c173d1-798c-4f98-bfe3-0251b7a19403-kube-api-access-5xc45" (OuterVolumeSpecName: "kube-api-access-5xc45") pod "18c173d1-798c-4f98-bfe3-0251b7a19403" (UID: "18c173d1-798c-4f98-bfe3-0251b7a19403"). InnerVolumeSpecName "kube-api-access-5xc45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.697950 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-config-data" (OuterVolumeSpecName: "config-data") pod "18c173d1-798c-4f98-bfe3-0251b7a19403" (UID: "18c173d1-798c-4f98-bfe3-0251b7a19403"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.702795 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18c173d1-798c-4f98-bfe3-0251b7a19403" (UID: "18c173d1-798c-4f98-bfe3-0251b7a19403"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.754335 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xc45\" (UniqueName: \"kubernetes.io/projected/18c173d1-798c-4f98-bfe3-0251b7a19403-kube-api-access-5xc45\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.754379 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.754390 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:24 crc kubenswrapper[4807]: I1202 20:23:24.754399 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c173d1-798c-4f98-bfe3-0251b7a19403-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.149984 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9z4k8" event={"ID":"18c173d1-798c-4f98-bfe3-0251b7a19403","Type":"ContainerDied","Data":"96fdf9c938ed1d88a87bce9e40f5265a73c6bea2e554b259cce94eb0e393d554"} Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.150515 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96fdf9c938ed1d88a87bce9e40f5265a73c6bea2e554b259cce94eb0e393d554" Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.150070 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9z4k8" Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.166820 4807 generic.go:334] "Generic (PLEG): container finished" podID="42be5b49-eca1-4532-8742-3dfbb8a4f910" containerID="39bd67727ba39f200c809ece4e6154872c2dacdc584517234a8e1e51448f65a6" exitCode=0 Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.166926 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-j86hv" event={"ID":"42be5b49-eca1-4532-8742-3dfbb8a4f910","Type":"ContainerDied","Data":"39bd67727ba39f200c809ece4e6154872c2dacdc584517234a8e1e51448f65a6"} Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.364773 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.365863 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e7142a8b-a137-4559-a955-9544266902a6" containerName="nova-api-api" containerID="cri-o://02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3" gracePeriod=30 Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.365790 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e7142a8b-a137-4559-a955-9544266902a6" containerName="nova-api-log" containerID="cri-o://1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7" gracePeriod=30 Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.385593 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.386043 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="34c5fea6-3631-42fa-9e77-7b4f1e714aec" containerName="nova-scheduler-scheduler" containerID="cri-o://91f98c155631acf2d9df29ca729301e14f1510fe56bccae37100dede184cda1d" gracePeriod=30 Dec 02 
20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.436690 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.437017 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="25d52d5e-fc8e-48bc-bb96-2a3933035103" containerName="nova-metadata-log" containerID="cri-o://20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c" gracePeriod=30 Dec 02 20:23:25 crc kubenswrapper[4807]: I1202 20:23:25.437732 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="25d52d5e-fc8e-48bc-bb96-2a3933035103" containerName="nova-metadata-metadata" containerID="cri-o://624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce" gracePeriod=30 Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.104315 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.189212 4807 generic.go:334] "Generic (PLEG): container finished" podID="25d52d5e-fc8e-48bc-bb96-2a3933035103" containerID="624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce" exitCode=0 Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.189271 4807 generic.go:334] "Generic (PLEG): container finished" podID="25d52d5e-fc8e-48bc-bb96-2a3933035103" containerID="20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c" exitCode=143 Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.189356 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"25d52d5e-fc8e-48bc-bb96-2a3933035103","Type":"ContainerDied","Data":"624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce"} Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.189414 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"25d52d5e-fc8e-48bc-bb96-2a3933035103","Type":"ContainerDied","Data":"20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c"} Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.189438 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"25d52d5e-fc8e-48bc-bb96-2a3933035103","Type":"ContainerDied","Data":"19bde7c55e1429c240160b4198ad1eab325d437695bff48ff8ec8345dc31d38f"} Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.189466 4807 scope.go:117] "RemoveContainer" containerID="624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.189703 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.194639 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-combined-ca-bundle\") pod \"25d52d5e-fc8e-48bc-bb96-2a3933035103\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.194679 4807 generic.go:334] "Generic (PLEG): container finished" podID="e7142a8b-a137-4559-a955-9544266902a6" containerID="1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7" exitCode=143 Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.194807 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-nova-metadata-tls-certs\") pod \"25d52d5e-fc8e-48bc-bb96-2a3933035103\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.194852 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-config-data\") pod \"25d52d5e-fc8e-48bc-bb96-2a3933035103\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.194900 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7142a8b-a137-4559-a955-9544266902a6","Type":"ContainerDied","Data":"1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7"} Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.194975 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d52d5e-fc8e-48bc-bb96-2a3933035103-logs\") pod \"25d52d5e-fc8e-48bc-bb96-2a3933035103\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.195151 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk57j\" (UniqueName: \"kubernetes.io/projected/25d52d5e-fc8e-48bc-bb96-2a3933035103-kube-api-access-dk57j\") pod \"25d52d5e-fc8e-48bc-bb96-2a3933035103\" (UID: \"25d52d5e-fc8e-48bc-bb96-2a3933035103\") " Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.195661 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d52d5e-fc8e-48bc-bb96-2a3933035103-logs" (OuterVolumeSpecName: "logs") pod "25d52d5e-fc8e-48bc-bb96-2a3933035103" (UID: "25d52d5e-fc8e-48bc-bb96-2a3933035103"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.220143 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d52d5e-fc8e-48bc-bb96-2a3933035103-kube-api-access-dk57j" (OuterVolumeSpecName: "kube-api-access-dk57j") pod "25d52d5e-fc8e-48bc-bb96-2a3933035103" (UID: "25d52d5e-fc8e-48bc-bb96-2a3933035103"). InnerVolumeSpecName "kube-api-access-dk57j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.234035 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25d52d5e-fc8e-48bc-bb96-2a3933035103" (UID: "25d52d5e-fc8e-48bc-bb96-2a3933035103"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.237779 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-config-data" (OuterVolumeSpecName: "config-data") pod "25d52d5e-fc8e-48bc-bb96-2a3933035103" (UID: "25d52d5e-fc8e-48bc-bb96-2a3933035103"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.248370 4807 scope.go:117] "RemoveContainer" containerID="20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.259710 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "25d52d5e-fc8e-48bc-bb96-2a3933035103" (UID: "25d52d5e-fc8e-48bc-bb96-2a3933035103"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.302030 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.302094 4807 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.302111 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25d52d5e-fc8e-48bc-bb96-2a3933035103-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.302129 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25d52d5e-fc8e-48bc-bb96-2a3933035103-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.302140 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk57j\" (UniqueName: \"kubernetes.io/projected/25d52d5e-fc8e-48bc-bb96-2a3933035103-kube-api-access-dk57j\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.311961 4807 scope.go:117] "RemoveContainer" containerID="624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce" Dec 02 20:23:26 crc kubenswrapper[4807]: E1202 20:23:26.312872 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce\": container with ID starting with 624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce not found: ID does not exist" 
containerID="624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.312959 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce"} err="failed to get container status \"624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce\": rpc error: code = NotFound desc = could not find container \"624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce\": container with ID starting with 624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce not found: ID does not exist" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.312997 4807 scope.go:117] "RemoveContainer" containerID="20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c" Dec 02 20:23:26 crc kubenswrapper[4807]: E1202 20:23:26.313984 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c\": container with ID starting with 20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c not found: ID does not exist" containerID="20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.314044 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c"} err="failed to get container status \"20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c\": rpc error: code = NotFound desc = could not find container \"20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c\": container with ID starting with 20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c not found: ID does not exist" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.314073 4807 scope.go:117] 
"RemoveContainer" containerID="624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.315746 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce"} err="failed to get container status \"624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce\": rpc error: code = NotFound desc = could not find container \"624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce\": container with ID starting with 624b6ed188906209c70bb629215d1e9b75988b8c024b95c189491193e28ab4ce not found: ID does not exist" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.315773 4807 scope.go:117] "RemoveContainer" containerID="20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.316477 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c"} err="failed to get container status \"20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c\": rpc error: code = NotFound desc = could not find container \"20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c\": container with ID starting with 20558d57cd79dd79f2ed058f7d3a55c7f32fa4299a633e05435e18e70b5cbc3c not found: ID does not exist" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.548779 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.560906 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.579187 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:26 crc kubenswrapper[4807]: E1202 20:23:26.580494 4807 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="18c173d1-798c-4f98-bfe3-0251b7a19403" containerName="nova-manage" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.580517 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c173d1-798c-4f98-bfe3-0251b7a19403" containerName="nova-manage" Dec 02 20:23:26 crc kubenswrapper[4807]: E1202 20:23:26.580568 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" containerName="init" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.580577 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" containerName="init" Dec 02 20:23:26 crc kubenswrapper[4807]: E1202 20:23:26.580595 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d52d5e-fc8e-48bc-bb96-2a3933035103" containerName="nova-metadata-log" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.580602 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d52d5e-fc8e-48bc-bb96-2a3933035103" containerName="nova-metadata-log" Dec 02 20:23:26 crc kubenswrapper[4807]: E1202 20:23:26.580847 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" containerName="dnsmasq-dns" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.580855 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" containerName="dnsmasq-dns" Dec 02 20:23:26 crc kubenswrapper[4807]: E1202 20:23:26.580869 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d52d5e-fc8e-48bc-bb96-2a3933035103" containerName="nova-metadata-metadata" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.581009 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d52d5e-fc8e-48bc-bb96-2a3933035103" containerName="nova-metadata-metadata" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.587137 4807 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="61a9f7ef-7bd0-49c9-8cd7-3eca252dca72" containerName="dnsmasq-dns" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.587223 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d52d5e-fc8e-48bc-bb96-2a3933035103" containerName="nova-metadata-log" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.587257 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d52d5e-fc8e-48bc-bb96-2a3933035103" containerName="nova-metadata-metadata" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.587276 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c173d1-798c-4f98-bfe3-0251b7a19403" containerName="nova-manage" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.589064 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.594109 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.594431 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.601723 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.675253 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.713566 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-config-data\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.713650 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khks6\" (UniqueName: \"kubernetes.io/projected/93b166c0-bf72-4774-97dc-f22b3b02f15a-kube-api-access-khks6\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.713708 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.714085 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.714227 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b166c0-bf72-4774-97dc-f22b3b02f15a-logs\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 
20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.816143 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-config-data\") pod \"42be5b49-eca1-4532-8742-3dfbb8a4f910\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.816252 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-scripts\") pod \"42be5b49-eca1-4532-8742-3dfbb8a4f910\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.816523 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2plrk\" (UniqueName: \"kubernetes.io/projected/42be5b49-eca1-4532-8742-3dfbb8a4f910-kube-api-access-2plrk\") pod \"42be5b49-eca1-4532-8742-3dfbb8a4f910\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.816565 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-combined-ca-bundle\") pod \"42be5b49-eca1-4532-8742-3dfbb8a4f910\" (UID: \"42be5b49-eca1-4532-8742-3dfbb8a4f910\") " Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.817059 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.817157 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/93b166c0-bf72-4774-97dc-f22b3b02f15a-logs\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.817251 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-config-data\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.817314 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khks6\" (UniqueName: \"kubernetes.io/projected/93b166c0-bf72-4774-97dc-f22b3b02f15a-kube-api-access-khks6\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.817376 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.818601 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b166c0-bf72-4774-97dc-f22b3b02f15a-logs\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.822664 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 
crc kubenswrapper[4807]: I1202 20:23:26.823216 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42be5b49-eca1-4532-8742-3dfbb8a4f910-kube-api-access-2plrk" (OuterVolumeSpecName: "kube-api-access-2plrk") pod "42be5b49-eca1-4532-8742-3dfbb8a4f910" (UID: "42be5b49-eca1-4532-8742-3dfbb8a4f910"). InnerVolumeSpecName "kube-api-access-2plrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.827008 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-config-data\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.828999 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.829243 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-scripts" (OuterVolumeSpecName: "scripts") pod "42be5b49-eca1-4532-8742-3dfbb8a4f910" (UID: "42be5b49-eca1-4532-8742-3dfbb8a4f910"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.842443 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khks6\" (UniqueName: \"kubernetes.io/projected/93b166c0-bf72-4774-97dc-f22b3b02f15a-kube-api-access-khks6\") pod \"nova-metadata-0\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") " pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.861893 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-config-data" (OuterVolumeSpecName: "config-data") pod "42be5b49-eca1-4532-8742-3dfbb8a4f910" (UID: "42be5b49-eca1-4532-8742-3dfbb8a4f910"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.863524 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42be5b49-eca1-4532-8742-3dfbb8a4f910" (UID: "42be5b49-eca1-4532-8742-3dfbb8a4f910"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.920310 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2plrk\" (UniqueName: \"kubernetes.io/projected/42be5b49-eca1-4532-8742-3dfbb8a4f910-kube-api-access-2plrk\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.920363 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.920381 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.920398 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42be5b49-eca1-4532-8742-3dfbb8a4f910-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.987102 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 20:23:26 crc kubenswrapper[4807]: I1202 20:23:26.994495 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d52d5e-fc8e-48bc-bb96-2a3933035103" path="/var/lib/kubelet/pods/25d52d5e-fc8e-48bc-bb96-2a3933035103/volumes" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.227250 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-j86hv" event={"ID":"42be5b49-eca1-4532-8742-3dfbb8a4f910","Type":"ContainerDied","Data":"235a7f98a3f147776d96a37472ab1760ab0aef5b3abe194ff9122d3ea29c9cf8"} Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.227710 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="235a7f98a3f147776d96a37472ab1760ab0aef5b3abe194ff9122d3ea29c9cf8" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.227416 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-j86hv" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.294955 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 20:23:27 crc kubenswrapper[4807]: E1202 20:23:27.295816 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42be5b49-eca1-4532-8742-3dfbb8a4f910" containerName="nova-cell1-conductor-db-sync" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.295836 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="42be5b49-eca1-4532-8742-3dfbb8a4f910" containerName="nova-cell1-conductor-db-sync" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.296160 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="42be5b49-eca1-4532-8742-3dfbb8a4f910" containerName="nova-cell1-conductor-db-sync" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.297645 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.302202 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.309059 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.431494 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5860289-2a92-47f1-855c-399a8c590f7f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c5860289-2a92-47f1-855c-399a8c590f7f\") " pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.431692 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cd5q\" (UniqueName: \"kubernetes.io/projected/c5860289-2a92-47f1-855c-399a8c590f7f-kube-api-access-6cd5q\") pod \"nova-cell1-conductor-0\" (UID: \"c5860289-2a92-47f1-855c-399a8c590f7f\") " pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.431804 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5860289-2a92-47f1-855c-399a8c590f7f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c5860289-2a92-47f1-855c-399a8c590f7f\") " pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.480121 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.534588 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5860289-2a92-47f1-855c-399a8c590f7f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c5860289-2a92-47f1-855c-399a8c590f7f\") " pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.534826 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cd5q\" (UniqueName: \"kubernetes.io/projected/c5860289-2a92-47f1-855c-399a8c590f7f-kube-api-access-6cd5q\") pod \"nova-cell1-conductor-0\" (UID: \"c5860289-2a92-47f1-855c-399a8c590f7f\") " pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.534904 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5860289-2a92-47f1-855c-399a8c590f7f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c5860289-2a92-47f1-855c-399a8c590f7f\") " pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.542399 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5860289-2a92-47f1-855c-399a8c590f7f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c5860289-2a92-47f1-855c-399a8c590f7f\") " pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.543838 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5860289-2a92-47f1-855c-399a8c590f7f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c5860289-2a92-47f1-855c-399a8c590f7f\") " pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.557467 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cd5q\" (UniqueName: \"kubernetes.io/projected/c5860289-2a92-47f1-855c-399a8c590f7f-kube-api-access-6cd5q\") pod \"nova-cell1-conductor-0\" (UID: 
\"c5860289-2a92-47f1-855c-399a8c590f7f\") " pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:27 crc kubenswrapper[4807]: I1202 20:23:27.635182 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:28 crc kubenswrapper[4807]: I1202 20:23:28.179250 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 20:23:28 crc kubenswrapper[4807]: W1202 20:23:28.181024 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5860289_2a92_47f1_855c_399a8c590f7f.slice/crio-89c0567ea2fb766942b27075b3348ee8e4b4a5100b46975f375cb7f5fbe261f8 WatchSource:0}: Error finding container 89c0567ea2fb766942b27075b3348ee8e4b4a5100b46975f375cb7f5fbe261f8: Status 404 returned error can't find the container with id 89c0567ea2fb766942b27075b3348ee8e4b4a5100b46975f375cb7f5fbe261f8 Dec 02 20:23:28 crc kubenswrapper[4807]: I1202 20:23:28.246180 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93b166c0-bf72-4774-97dc-f22b3b02f15a","Type":"ContainerStarted","Data":"7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409"} Dec 02 20:23:28 crc kubenswrapper[4807]: I1202 20:23:28.246463 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93b166c0-bf72-4774-97dc-f22b3b02f15a","Type":"ContainerStarted","Data":"78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6"} Dec 02 20:23:28 crc kubenswrapper[4807]: I1202 20:23:28.246475 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93b166c0-bf72-4774-97dc-f22b3b02f15a","Type":"ContainerStarted","Data":"003dd17181e1d63f72bfd87568cb33bae48e5fc6567ac3258b6d7bbfafa01eb7"} Dec 02 20:23:28 crc kubenswrapper[4807]: I1202 20:23:28.247830 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-0" event={"ID":"c5860289-2a92-47f1-855c-399a8c590f7f","Type":"ContainerStarted","Data":"89c0567ea2fb766942b27075b3348ee8e4b4a5100b46975f375cb7f5fbe261f8"} Dec 02 20:23:28 crc kubenswrapper[4807]: I1202 20:23:28.284422 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.284391913 podStartE2EDuration="2.284391913s" podCreationTimestamp="2025-12-02 20:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:23:28.272443527 +0000 UTC m=+1543.573351022" watchObservedRunningTime="2025-12-02 20:23:28.284391913 +0000 UTC m=+1543.585299408" Dec 02 20:23:28 crc kubenswrapper[4807]: I1202 20:23:28.293833 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:23:28 crc kubenswrapper[4807]: I1202 20:23:28.293969 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.083451 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.183122 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7142a8b-a137-4559-a955-9544266902a6-logs\") pod \"e7142a8b-a137-4559-a955-9544266902a6\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.183360 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7r2w\" (UniqueName: \"kubernetes.io/projected/e7142a8b-a137-4559-a955-9544266902a6-kube-api-access-w7r2w\") pod \"e7142a8b-a137-4559-a955-9544266902a6\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.183424 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-config-data\") pod \"e7142a8b-a137-4559-a955-9544266902a6\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.183476 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-combined-ca-bundle\") pod \"e7142a8b-a137-4559-a955-9544266902a6\" (UID: \"e7142a8b-a137-4559-a955-9544266902a6\") " Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.184336 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7142a8b-a137-4559-a955-9544266902a6-logs" (OuterVolumeSpecName: "logs") pod "e7142a8b-a137-4559-a955-9544266902a6" (UID: "e7142a8b-a137-4559-a955-9544266902a6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.201376 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7142a8b-a137-4559-a955-9544266902a6-kube-api-access-w7r2w" (OuterVolumeSpecName: "kube-api-access-w7r2w") pod "e7142a8b-a137-4559-a955-9544266902a6" (UID: "e7142a8b-a137-4559-a955-9544266902a6"). InnerVolumeSpecName "kube-api-access-w7r2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.214996 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7142a8b-a137-4559-a955-9544266902a6" (UID: "e7142a8b-a137-4559-a955-9544266902a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.217003 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-config-data" (OuterVolumeSpecName: "config-data") pod "e7142a8b-a137-4559-a955-9544266902a6" (UID: "e7142a8b-a137-4559-a955-9544266902a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.266041 4807 generic.go:334] "Generic (PLEG): container finished" podID="e7142a8b-a137-4559-a955-9544266902a6" containerID="02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3" exitCode=0 Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.266124 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7142a8b-a137-4559-a955-9544266902a6","Type":"ContainerDied","Data":"02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3"} Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.266161 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7142a8b-a137-4559-a955-9544266902a6","Type":"ContainerDied","Data":"9728e41a1d04687e855d26a682698f007bdaed95c75ccc4697ea20bbdca2838d"} Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.266183 4807 scope.go:117] "RemoveContainer" containerID="02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.266364 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.281035 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c5860289-2a92-47f1-855c-399a8c590f7f","Type":"ContainerStarted","Data":"f201a6ae6e1610b7db5d46a57769c3eb9c9bd3018ffb5f32a49132975d8cf92e"} Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.281473 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.285993 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7r2w\" (UniqueName: \"kubernetes.io/projected/e7142a8b-a137-4559-a955-9544266902a6-kube-api-access-w7r2w\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.286025 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.286037 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7142a8b-a137-4559-a955-9544266902a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.286049 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7142a8b-a137-4559-a955-9544266902a6-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.311326 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.311293825 podStartE2EDuration="2.311293825s" podCreationTimestamp="2025-12-02 20:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-02 20:23:29.303250075 +0000 UTC m=+1544.604157570" watchObservedRunningTime="2025-12-02 20:23:29.311293825 +0000 UTC m=+1544.612201320" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.342409 4807 scope.go:117] "RemoveContainer" containerID="1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.364603 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.387064 4807 scope.go:117] "RemoveContainer" containerID="02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3" Dec 02 20:23:29 crc kubenswrapper[4807]: E1202 20:23:29.387697 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3\": container with ID starting with 02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3 not found: ID does not exist" containerID="02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.387746 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3"} err="failed to get container status \"02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3\": rpc error: code = NotFound desc = could not find container \"02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3\": container with ID starting with 02c1ec521ae711bbed2ec54afee6650b03a4a822eb59cd3e9fa6c9240bd1aec3 not found: ID does not exist" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.387772 4807 scope.go:117] "RemoveContainer" containerID="1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7" Dec 02 20:23:29 crc kubenswrapper[4807]: E1202 20:23:29.388332 4807 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7\": container with ID starting with 1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7 not found: ID does not exist" containerID="1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.388359 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7"} err="failed to get container status \"1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7\": rpc error: code = NotFound desc = could not find container \"1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7\": container with ID starting with 1b4d12802e9c948ee70e4cc3b2a07c755404462b9c230d7dc748893065e6e4e7 not found: ID does not exist" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.399484 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.427938 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:29 crc kubenswrapper[4807]: E1202 20:23:29.428964 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7142a8b-a137-4559-a955-9544266902a6" containerName="nova-api-log" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.428999 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7142a8b-a137-4559-a955-9544266902a6" containerName="nova-api-log" Dec 02 20:23:29 crc kubenswrapper[4807]: E1202 20:23:29.429025 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7142a8b-a137-4559-a955-9544266902a6" containerName="nova-api-api" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.429036 4807 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e7142a8b-a137-4559-a955-9544266902a6" containerName="nova-api-api" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.429325 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7142a8b-a137-4559-a955-9544266902a6" containerName="nova-api-api" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.429380 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7142a8b-a137-4559-a955-9544266902a6" containerName="nova-api-log" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.431080 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.433993 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.441301 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.594262 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4z7q\" (UniqueName: \"kubernetes.io/projected/0e02b468-5c68-4284-8a76-2be6380aeb8b-kube-api-access-q4z7q\") pod \"nova-api-0\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.595267 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-config-data\") pod \"nova-api-0\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.595520 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e02b468-5c68-4284-8a76-2be6380aeb8b-logs\") pod \"nova-api-0\" (UID: 
\"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.595736 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.698079 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-config-data\") pod \"nova-api-0\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.698176 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e02b468-5c68-4284-8a76-2be6380aeb8b-logs\") pod \"nova-api-0\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.698223 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.698270 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4z7q\" (UniqueName: \"kubernetes.io/projected/0e02b468-5c68-4284-8a76-2be6380aeb8b-kube-api-access-q4z7q\") pod \"nova-api-0\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.698603 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/0e02b468-5c68-4284-8a76-2be6380aeb8b-logs\") pod \"nova-api-0\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.704144 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.704324 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-config-data\") pod \"nova-api-0\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: E1202 20:23:29.723615 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91f98c155631acf2d9df29ca729301e14f1510fe56bccae37100dede184cda1d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.724245 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4z7q\" (UniqueName: \"kubernetes.io/projected/0e02b468-5c68-4284-8a76-2be6380aeb8b-kube-api-access-q4z7q\") pod \"nova-api-0\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " pod="openstack/nova-api-0" Dec 02 20:23:29 crc kubenswrapper[4807]: E1202 20:23:29.725800 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91f98c155631acf2d9df29ca729301e14f1510fe56bccae37100dede184cda1d" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 20:23:29 crc kubenswrapper[4807]: E1202 20:23:29.727146 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91f98c155631acf2d9df29ca729301e14f1510fe56bccae37100dede184cda1d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 20:23:29 crc kubenswrapper[4807]: E1202 20:23:29.727191 4807 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="34c5fea6-3631-42fa-9e77-7b4f1e714aec" containerName="nova-scheduler-scheduler" Dec 02 20:23:29 crc kubenswrapper[4807]: I1202 20:23:29.751800 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.389140 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.431314 4807 generic.go:334] "Generic (PLEG): container finished" podID="34c5fea6-3631-42fa-9e77-7b4f1e714aec" containerID="91f98c155631acf2d9df29ca729301e14f1510fe56bccae37100dede184cda1d" exitCode=0 Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.432134 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"34c5fea6-3631-42fa-9e77-7b4f1e714aec","Type":"ContainerDied","Data":"91f98c155631acf2d9df29ca729301e14f1510fe56bccae37100dede184cda1d"} Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.487849 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.643273 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-combined-ca-bundle\") pod \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\" (UID: \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.645795 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwcb4\" (UniqueName: \"kubernetes.io/projected/34c5fea6-3631-42fa-9e77-7b4f1e714aec-kube-api-access-pwcb4\") pod \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\" (UID: \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.645915 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-config-data\") pod \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\" (UID: \"34c5fea6-3631-42fa-9e77-7b4f1e714aec\") " Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.652953 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c5fea6-3631-42fa-9e77-7b4f1e714aec-kube-api-access-pwcb4" (OuterVolumeSpecName: "kube-api-access-pwcb4") pod "34c5fea6-3631-42fa-9e77-7b4f1e714aec" (UID: "34c5fea6-3631-42fa-9e77-7b4f1e714aec"). InnerVolumeSpecName "kube-api-access-pwcb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.683497 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-config-data" (OuterVolumeSpecName: "config-data") pod "34c5fea6-3631-42fa-9e77-7b4f1e714aec" (UID: "34c5fea6-3631-42fa-9e77-7b4f1e714aec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.704204 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34c5fea6-3631-42fa-9e77-7b4f1e714aec" (UID: "34c5fea6-3631-42fa-9e77-7b4f1e714aec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.751056 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.751705 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c5fea6-3631-42fa-9e77-7b4f1e714aec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:30 crc kubenswrapper[4807]: I1202 20:23:30.751849 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwcb4\" (UniqueName: \"kubernetes.io/projected/34c5fea6-3631-42fa-9e77-7b4f1e714aec-kube-api-access-pwcb4\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.004830 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7142a8b-a137-4559-a955-9544266902a6" path="/var/lib/kubelet/pods/e7142a8b-a137-4559-a955-9544266902a6/volumes" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.447548 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e02b468-5c68-4284-8a76-2be6380aeb8b","Type":"ContainerStarted","Data":"026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85"} Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.449114 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"0e02b468-5c68-4284-8a76-2be6380aeb8b","Type":"ContainerStarted","Data":"6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21"} Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.449197 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e02b468-5c68-4284-8a76-2be6380aeb8b","Type":"ContainerStarted","Data":"8f8fbbfdaac2aa162423c5f413086a96bae3f27b4f26f21e453b1c6a8827684e"} Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.450450 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"34c5fea6-3631-42fa-9e77-7b4f1e714aec","Type":"ContainerDied","Data":"5118cdd885e702f844eae4352be8faeb668a9527da1e03e93b003f0d192cf932"} Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.450532 4807 scope.go:117] "RemoveContainer" containerID="91f98c155631acf2d9df29ca729301e14f1510fe56bccae37100dede184cda1d" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.450673 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.500980 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5009460949999998 podStartE2EDuration="2.500946095s" podCreationTimestamp="2025-12-02 20:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:23:31.476316892 +0000 UTC m=+1546.777224397" watchObservedRunningTime="2025-12-02 20:23:31.500946095 +0000 UTC m=+1546.801853590" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.528067 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.563541 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.582115 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 20:23:31 crc kubenswrapper[4807]: E1202 20:23:31.583602 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c5fea6-3631-42fa-9e77-7b4f1e714aec" containerName="nova-scheduler-scheduler" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.583633 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c5fea6-3631-42fa-9e77-7b4f1e714aec" containerName="nova-scheduler-scheduler" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.584337 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="34c5fea6-3631-42fa-9e77-7b4f1e714aec" containerName="nova-scheduler-scheduler" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.585949 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.604675 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.604690 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.692371 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.692508 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-config-data\") pod \"nova-scheduler-0\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.692675 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdnt2\" (UniqueName: \"kubernetes.io/projected/6d49888e-5c74-4550-b491-8b526b9fe3a8-kube-api-access-hdnt2\") pod \"nova-scheduler-0\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.795134 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdnt2\" (UniqueName: \"kubernetes.io/projected/6d49888e-5c74-4550-b491-8b526b9fe3a8-kube-api-access-hdnt2\") pod \"nova-scheduler-0\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.795238 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.795282 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-config-data\") pod \"nova-scheduler-0\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.804547 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.805434 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-config-data\") pod \"nova-scheduler-0\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.828418 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdnt2\" (UniqueName: \"kubernetes.io/projected/6d49888e-5c74-4550-b491-8b526b9fe3a8-kube-api-access-hdnt2\") pod \"nova-scheduler-0\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.922209 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.987629 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 20:23:31 crc kubenswrapper[4807]: I1202 20:23:31.988229 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 20:23:32 crc kubenswrapper[4807]: I1202 20:23:32.466414 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 20:23:32 crc kubenswrapper[4807]: I1202 20:23:32.488873 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6d49888e-5c74-4550-b491-8b526b9fe3a8","Type":"ContainerStarted","Data":"159cbfde0be03e78fd11b892b87f2ed9888e3a56d2a0a1cb2008be2d4b303599"} Dec 02 20:23:32 crc kubenswrapper[4807]: I1202 20:23:32.987405 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c5fea6-3631-42fa-9e77-7b4f1e714aec" path="/var/lib/kubelet/pods/34c5fea6-3631-42fa-9e77-7b4f1e714aec/volumes" Dec 02 20:23:33 crc kubenswrapper[4807]: I1202 20:23:33.509988 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6d49888e-5c74-4550-b491-8b526b9fe3a8","Type":"ContainerStarted","Data":"d9c534ccd25cbca66b8c1cb2a6f93fd09e50be6710183062e4025e1b1fb1780e"} Dec 02 20:23:33 crc kubenswrapper[4807]: I1202 20:23:33.531300 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.531274149 podStartE2EDuration="2.531274149s" podCreationTimestamp="2025-12-02 20:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:23:33.527308461 +0000 UTC m=+1548.828215976" watchObservedRunningTime="2025-12-02 20:23:33.531274149 +0000 UTC m=+1548.832181634" Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 
20:23:35.156294 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z4h4f"] Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.160556 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.171074 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z4h4f"] Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.215981 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb9kz\" (UniqueName: \"kubernetes.io/projected/868416ad-83ed-4fa1-897b-14f454303f2c-kube-api-access-pb9kz\") pod \"community-operators-z4h4f\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.216087 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-utilities\") pod \"community-operators-z4h4f\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.216224 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-catalog-content\") pod \"community-operators-z4h4f\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.318832 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-utilities\") pod 
\"community-operators-z4h4f\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.318977 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-catalog-content\") pod \"community-operators-z4h4f\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.319365 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb9kz\" (UniqueName: \"kubernetes.io/projected/868416ad-83ed-4fa1-897b-14f454303f2c-kube-api-access-pb9kz\") pod \"community-operators-z4h4f\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.319666 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-utilities\") pod \"community-operators-z4h4f\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.319705 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-catalog-content\") pod \"community-operators-z4h4f\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.350906 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb9kz\" (UniqueName: \"kubernetes.io/projected/868416ad-83ed-4fa1-897b-14f454303f2c-kube-api-access-pb9kz\") pod 
\"community-operators-z4h4f\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:35 crc kubenswrapper[4807]: I1202 20:23:35.502169 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:36 crc kubenswrapper[4807]: I1202 20:23:36.106818 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z4h4f"] Dec 02 20:23:36 crc kubenswrapper[4807]: W1202 20:23:36.111175 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod868416ad_83ed_4fa1_897b_14f454303f2c.slice/crio-adb16d3bed6757f2fae99f369613e3282b8908a28baf2bc7658ddce7723aaede WatchSource:0}: Error finding container adb16d3bed6757f2fae99f369613e3282b8908a28baf2bc7658ddce7723aaede: Status 404 returned error can't find the container with id adb16d3bed6757f2fae99f369613e3282b8908a28baf2bc7658ddce7723aaede Dec 02 20:23:36 crc kubenswrapper[4807]: I1202 20:23:36.548502 4807 generic.go:334] "Generic (PLEG): container finished" podID="868416ad-83ed-4fa1-897b-14f454303f2c" containerID="202806e184213dd241b7c52d6cf6d04077fe00fd99e4b87c811d86e26de0d1c1" exitCode=0 Dec 02 20:23:36 crc kubenswrapper[4807]: I1202 20:23:36.548618 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4h4f" event={"ID":"868416ad-83ed-4fa1-897b-14f454303f2c","Type":"ContainerDied","Data":"202806e184213dd241b7c52d6cf6d04077fe00fd99e4b87c811d86e26de0d1c1"} Dec 02 20:23:36 crc kubenswrapper[4807]: I1202 20:23:36.549014 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4h4f" event={"ID":"868416ad-83ed-4fa1-897b-14f454303f2c","Type":"ContainerStarted","Data":"adb16d3bed6757f2fae99f369613e3282b8908a28baf2bc7658ddce7723aaede"} Dec 02 20:23:36 crc kubenswrapper[4807]: I1202 
20:23:36.922357 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 20:23:36 crc kubenswrapper[4807]: I1202 20:23:36.993453 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 20:23:36 crc kubenswrapper[4807]: I1202 20:23:36.993592 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 20:23:37 crc kubenswrapper[4807]: I1202 20:23:37.561854 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4h4f" event={"ID":"868416ad-83ed-4fa1-897b-14f454303f2c","Type":"ContainerStarted","Data":"d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf"} Dec 02 20:23:37 crc kubenswrapper[4807]: I1202 20:23:37.672637 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 02 20:23:38 crc kubenswrapper[4807]: I1202 20:23:38.004048 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 20:23:38 crc kubenswrapper[4807]: I1202 20:23:38.004558 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 20:23:38 crc kubenswrapper[4807]: I1202 20:23:38.581634 4807 generic.go:334] "Generic (PLEG): container finished" podID="868416ad-83ed-4fa1-897b-14f454303f2c" containerID="d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf" exitCode=0 Dec 02 20:23:38 crc 
kubenswrapper[4807]: I1202 20:23:38.581700 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4h4f" event={"ID":"868416ad-83ed-4fa1-897b-14f454303f2c","Type":"ContainerDied","Data":"d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf"} Dec 02 20:23:39 crc kubenswrapper[4807]: I1202 20:23:39.753275 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 20:23:39 crc kubenswrapper[4807]: I1202 20:23:39.753875 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 20:23:40 crc kubenswrapper[4807]: I1202 20:23:40.836096 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 20:23:40 crc kubenswrapper[4807]: I1202 20:23:40.836144 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 20:23:41 crc kubenswrapper[4807]: I1202 20:23:41.631821 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4h4f" event={"ID":"868416ad-83ed-4fa1-897b-14f454303f2c","Type":"ContainerStarted","Data":"10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970"} Dec 02 20:23:41 crc kubenswrapper[4807]: I1202 20:23:41.659533 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z4h4f" podStartSLOduration=2.8000595600000002 podStartE2EDuration="6.659509984s" podCreationTimestamp="2025-12-02 20:23:35 +0000 UTC" 
firstStartedPulling="2025-12-02 20:23:36.551077062 +0000 UTC m=+1551.851984557" lastFinishedPulling="2025-12-02 20:23:40.410527446 +0000 UTC m=+1555.711434981" observedRunningTime="2025-12-02 20:23:41.653741382 +0000 UTC m=+1556.954648887" watchObservedRunningTime="2025-12-02 20:23:41.659509984 +0000 UTC m=+1556.960417479" Dec 02 20:23:41 crc kubenswrapper[4807]: I1202 20:23:41.923255 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 20:23:41 crc kubenswrapper[4807]: I1202 20:23:41.955818 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 20:23:42 crc kubenswrapper[4807]: I1202 20:23:42.685521 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 20:23:42 crc kubenswrapper[4807]: I1202 20:23:42.750984 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 20:23:45 crc kubenswrapper[4807]: I1202 20:23:45.502744 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:45 crc kubenswrapper[4807]: I1202 20:23:45.503795 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:45 crc kubenswrapper[4807]: I1202 20:23:45.567351 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:45 crc kubenswrapper[4807]: I1202 20:23:45.755142 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:45 crc kubenswrapper[4807]: I1202 20:23:45.829037 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z4h4f"] Dec 02 20:23:46 crc kubenswrapper[4807]: I1202 20:23:46.997632 
4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:46.998351 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.004907 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.006024 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.351124 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.532144 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-config-data\") pod \"b9a435bd-d058-4ef5-a457-e23bcccfb168\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.532373 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-combined-ca-bundle\") pod \"b9a435bd-d058-4ef5-a457-e23bcccfb168\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.532857 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v4th\" (UniqueName: \"kubernetes.io/projected/b9a435bd-d058-4ef5-a457-e23bcccfb168-kube-api-access-4v4th\") pod \"b9a435bd-d058-4ef5-a457-e23bcccfb168\" (UID: \"b9a435bd-d058-4ef5-a457-e23bcccfb168\") " Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.545936 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/b9a435bd-d058-4ef5-a457-e23bcccfb168-kube-api-access-4v4th" (OuterVolumeSpecName: "kube-api-access-4v4th") pod "b9a435bd-d058-4ef5-a457-e23bcccfb168" (UID: "b9a435bd-d058-4ef5-a457-e23bcccfb168"). InnerVolumeSpecName "kube-api-access-4v4th". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.568945 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-config-data" (OuterVolumeSpecName: "config-data") pod "b9a435bd-d058-4ef5-a457-e23bcccfb168" (UID: "b9a435bd-d058-4ef5-a457-e23bcccfb168"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.572112 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9a435bd-d058-4ef5-a457-e23bcccfb168" (UID: "b9a435bd-d058-4ef5-a457-e23bcccfb168"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.636632 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.636690 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a435bd-d058-4ef5-a457-e23bcccfb168-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.636710 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v4th\" (UniqueName: \"kubernetes.io/projected/b9a435bd-d058-4ef5-a457-e23bcccfb168-kube-api-access-4v4th\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.726137 4807 generic.go:334] "Generic (PLEG): container finished" podID="b9a435bd-d058-4ef5-a457-e23bcccfb168" containerID="09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34" exitCode=137 Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.726224 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9a435bd-d058-4ef5-a457-e23bcccfb168","Type":"ContainerDied","Data":"09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34"} Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.726315 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9a435bd-d058-4ef5-a457-e23bcccfb168","Type":"ContainerDied","Data":"0ea22dd8358dd5cb482dcdde7d2d072844fbbbab9f0fc468f63ba11c9469c2b9"} Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.726342 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.726343 4807 scope.go:117] "RemoveContainer" containerID="09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.726990 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z4h4f" podUID="868416ad-83ed-4fa1-897b-14f454303f2c" containerName="registry-server" containerID="cri-o://10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970" gracePeriod=2 Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.794919 4807 scope.go:117] "RemoveContainer" containerID="09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34" Dec 02 20:23:47 crc kubenswrapper[4807]: E1202 20:23:47.798259 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34\": container with ID starting with 09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34 not found: ID does not exist" containerID="09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.798310 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34"} err="failed to get container status \"09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34\": rpc error: code = NotFound desc = could not find container \"09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34\": container with ID starting with 09dae8634369357ce6fd869b14fe74fa172d69e1f376fb210b660079c7227c34 not found: ID does not exist" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.829906 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.865079 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.876645 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 20:23:47 crc kubenswrapper[4807]: E1202 20:23:47.877454 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a435bd-d058-4ef5-a457-e23bcccfb168" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.877482 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a435bd-d058-4ef5-a457-e23bcccfb168" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.877748 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a435bd-d058-4ef5-a457-e23bcccfb168" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.879004 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.885947 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.886044 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.886174 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 20:23:47 crc kubenswrapper[4807]: I1202 20:23:47.886320 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.049919 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.050040 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.050091 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wszg\" (UniqueName: \"kubernetes.io/projected/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-kube-api-access-2wszg\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 
crc kubenswrapper[4807]: I1202 20:23:48.052858 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.053023 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.156100 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.156212 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.156276 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 
20:23:48.156353 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.156396 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wszg\" (UniqueName: \"kubernetes.io/projected/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-kube-api-access-2wszg\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.163473 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.163537 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.164989 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.166843 4807 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.178668 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wszg\" (UniqueName: \"kubernetes.io/projected/fc1416d8-8665-48f9-ad43-b2e16b6a5ecb-kube-api-access-2wszg\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.212144 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.253019 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4j2hj"] Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.256371 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.258889 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-utilities\") pod \"redhat-marketplace-4j2hj\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.262771 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j2hj"] Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.259050 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-catalog-content\") pod \"redhat-marketplace-4j2hj\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.262977 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lszb7\" (UniqueName: \"kubernetes.io/projected/11461dcc-2c59-4b98-b246-5908dca9a42b-kube-api-access-lszb7\") pod \"redhat-marketplace-4j2hj\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.366166 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-catalog-content\") pod \"redhat-marketplace-4j2hj\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.366756 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lszb7\" (UniqueName: \"kubernetes.io/projected/11461dcc-2c59-4b98-b246-5908dca9a42b-kube-api-access-lszb7\") pod \"redhat-marketplace-4j2hj\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.366918 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-utilities\") pod \"redhat-marketplace-4j2hj\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.366934 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-catalog-content\") pod \"redhat-marketplace-4j2hj\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.367179 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-utilities\") pod \"redhat-marketplace-4j2hj\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.397403 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lszb7\" (UniqueName: \"kubernetes.io/projected/11461dcc-2c59-4b98-b246-5908dca9a42b-kube-api-access-lszb7\") pod \"redhat-marketplace-4j2hj\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.398271 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.468968 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb9kz\" (UniqueName: \"kubernetes.io/projected/868416ad-83ed-4fa1-897b-14f454303f2c-kube-api-access-pb9kz\") pod \"868416ad-83ed-4fa1-897b-14f454303f2c\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.469057 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-utilities\") pod \"868416ad-83ed-4fa1-897b-14f454303f2c\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.469468 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-catalog-content\") pod \"868416ad-83ed-4fa1-897b-14f454303f2c\" (UID: \"868416ad-83ed-4fa1-897b-14f454303f2c\") " Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.470135 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-utilities" (OuterVolumeSpecName: "utilities") pod "868416ad-83ed-4fa1-897b-14f454303f2c" (UID: "868416ad-83ed-4fa1-897b-14f454303f2c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.470947 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.480755 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868416ad-83ed-4fa1-897b-14f454303f2c-kube-api-access-pb9kz" (OuterVolumeSpecName: "kube-api-access-pb9kz") pod "868416ad-83ed-4fa1-897b-14f454303f2c" (UID: "868416ad-83ed-4fa1-897b-14f454303f2c"). InnerVolumeSpecName "kube-api-access-pb9kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.535519 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "868416ad-83ed-4fa1-897b-14f454303f2c" (UID: "868416ad-83ed-4fa1-897b-14f454303f2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.573119 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb9kz\" (UniqueName: \"kubernetes.io/projected/868416ad-83ed-4fa1-897b-14f454303f2c-kube-api-access-pb9kz\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.573199 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/868416ad-83ed-4fa1-897b-14f454303f2c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.690511 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.750103 4807 generic.go:334] "Generic (PLEG): container finished" podID="868416ad-83ed-4fa1-897b-14f454303f2c" containerID="10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970" exitCode=0 Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.750279 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z4h4f" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.751939 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4h4f" event={"ID":"868416ad-83ed-4fa1-897b-14f454303f2c","Type":"ContainerDied","Data":"10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970"} Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.752073 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z4h4f" event={"ID":"868416ad-83ed-4fa1-897b-14f454303f2c","Type":"ContainerDied","Data":"adb16d3bed6757f2fae99f369613e3282b8908a28baf2bc7658ddce7723aaede"} Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.752107 4807 scope.go:117] "RemoveContainer" containerID="10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.770442 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.816581 4807 scope.go:117] "RemoveContainer" containerID="d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.818566 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z4h4f"] Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.839382 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-z4h4f"] Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.863332 4807 scope.go:117] "RemoveContainer" containerID="202806e184213dd241b7c52d6cf6d04077fe00fd99e4b87c811d86e26de0d1c1" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.943073 4807 scope.go:117] "RemoveContainer" containerID="10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970" Dec 02 20:23:48 crc kubenswrapper[4807]: E1202 20:23:48.944683 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970\": container with ID starting with 10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970 not found: ID does not exist" containerID="10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.944726 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970"} err="failed to get container status \"10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970\": rpc error: code = NotFound desc = could not find container \"10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970\": container with ID starting with 10472eec0cf914f64be6b2161059a6a76a16a4ca87702b7c19fc720d257ae970 not found: ID does not exist" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.944752 4807 scope.go:117] "RemoveContainer" containerID="d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf" Dec 02 20:23:48 crc kubenswrapper[4807]: E1202 20:23:48.946381 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf\": container with ID starting with 
d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf not found: ID does not exist" containerID="d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.946437 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf"} err="failed to get container status \"d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf\": rpc error: code = NotFound desc = could not find container \"d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf\": container with ID starting with d63ba4da3ad7c4c0a74d02cde067e5ac7cff5c6478f5bd7fce0928e0d27c71bf not found: ID does not exist" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.946475 4807 scope.go:117] "RemoveContainer" containerID="202806e184213dd241b7c52d6cf6d04077fe00fd99e4b87c811d86e26de0d1c1" Dec 02 20:23:48 crc kubenswrapper[4807]: E1202 20:23:48.946818 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"202806e184213dd241b7c52d6cf6d04077fe00fd99e4b87c811d86e26de0d1c1\": container with ID starting with 202806e184213dd241b7c52d6cf6d04077fe00fd99e4b87c811d86e26de0d1c1 not found: ID does not exist" containerID="202806e184213dd241b7c52d6cf6d04077fe00fd99e4b87c811d86e26de0d1c1" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.946849 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202806e184213dd241b7c52d6cf6d04077fe00fd99e4b87c811d86e26de0d1c1"} err="failed to get container status \"202806e184213dd241b7c52d6cf6d04077fe00fd99e4b87c811d86e26de0d1c1\": rpc error: code = NotFound desc = could not find container \"202806e184213dd241b7c52d6cf6d04077fe00fd99e4b87c811d86e26de0d1c1\": container with ID starting with 202806e184213dd241b7c52d6cf6d04077fe00fd99e4b87c811d86e26de0d1c1 not found: ID does not 
exist" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.995192 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868416ad-83ed-4fa1-897b-14f454303f2c" path="/var/lib/kubelet/pods/868416ad-83ed-4fa1-897b-14f454303f2c/volumes" Dec 02 20:23:48 crc kubenswrapper[4807]: I1202 20:23:48.996177 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9a435bd-d058-4ef5-a457-e23bcccfb168" path="/var/lib/kubelet/pods/b9a435bd-d058-4ef5-a457-e23bcccfb168/volumes" Dec 02 20:23:49 crc kubenswrapper[4807]: I1202 20:23:49.238277 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j2hj"] Dec 02 20:23:49 crc kubenswrapper[4807]: W1202 20:23:49.264307 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11461dcc_2c59_4b98_b246_5908dca9a42b.slice/crio-2ff14c1b19a107fd4c3be772cbf49cb2a07fd95f3b4e0ea2bddd770abb5f85fa WatchSource:0}: Error finding container 2ff14c1b19a107fd4c3be772cbf49cb2a07fd95f3b4e0ea2bddd770abb5f85fa: Status 404 returned error can't find the container with id 2ff14c1b19a107fd4c3be772cbf49cb2a07fd95f3b4e0ea2bddd770abb5f85fa Dec 02 20:23:49 crc kubenswrapper[4807]: I1202 20:23:49.761839 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 20:23:49 crc kubenswrapper[4807]: I1202 20:23:49.763171 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 20:23:49 crc kubenswrapper[4807]: I1202 20:23:49.768886 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 20:23:49 crc kubenswrapper[4807]: I1202 20:23:49.770099 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 20:23:49 crc kubenswrapper[4807]: I1202 20:23:49.780256 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="11461dcc-2c59-4b98-b246-5908dca9a42b" containerID="c1bec5bf48c00f954b3ac21a24e3c2b21ae22784767745c914af6e75791790d8" exitCode=0 Dec 02 20:23:49 crc kubenswrapper[4807]: I1202 20:23:49.780347 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j2hj" event={"ID":"11461dcc-2c59-4b98-b246-5908dca9a42b","Type":"ContainerDied","Data":"c1bec5bf48c00f954b3ac21a24e3c2b21ae22784767745c914af6e75791790d8"} Dec 02 20:23:49 crc kubenswrapper[4807]: I1202 20:23:49.780390 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j2hj" event={"ID":"11461dcc-2c59-4b98-b246-5908dca9a42b","Type":"ContainerStarted","Data":"2ff14c1b19a107fd4c3be772cbf49cb2a07fd95f3b4e0ea2bddd770abb5f85fa"} Dec 02 20:23:49 crc kubenswrapper[4807]: I1202 20:23:49.787435 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb","Type":"ContainerStarted","Data":"27935cb51906c67c872bd5343f6504634b420cbf584d56a54c936902a855d0c9"} Dec 02 20:23:49 crc kubenswrapper[4807]: I1202 20:23:49.787503 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fc1416d8-8665-48f9-ad43-b2e16b6a5ecb","Type":"ContainerStarted","Data":"e0d0e5091c326080863e1f5fd4a5054c3dc765b986d67e961e3689513eb934d6"} Dec 02 20:23:49 crc kubenswrapper[4807]: I1202 20:23:49.839009 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.838977934 podStartE2EDuration="2.838977934s" podCreationTimestamp="2025-12-02 20:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:23:49.825625806 +0000 UTC m=+1565.126533301" watchObservedRunningTime="2025-12-02 20:23:49.838977934 +0000 UTC m=+1565.139885429" Dec 02 20:23:50 crc 
kubenswrapper[4807]: I1202 20:23:50.822071 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j2hj" event={"ID":"11461dcc-2c59-4b98-b246-5908dca9a42b","Type":"ContainerStarted","Data":"649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601"} Dec 02 20:23:50 crc kubenswrapper[4807]: I1202 20:23:50.822626 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 20:23:50 crc kubenswrapper[4807]: I1202 20:23:50.831626 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.079580 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vl9gx"] Dec 02 20:23:51 crc kubenswrapper[4807]: E1202 20:23:51.080263 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868416ad-83ed-4fa1-897b-14f454303f2c" containerName="extract-content" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.080283 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="868416ad-83ed-4fa1-897b-14f454303f2c" containerName="extract-content" Dec 02 20:23:51 crc kubenswrapper[4807]: E1202 20:23:51.080295 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868416ad-83ed-4fa1-897b-14f454303f2c" containerName="registry-server" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.080305 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="868416ad-83ed-4fa1-897b-14f454303f2c" containerName="registry-server" Dec 02 20:23:51 crc kubenswrapper[4807]: E1202 20:23:51.080316 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868416ad-83ed-4fa1-897b-14f454303f2c" containerName="extract-utilities" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.080325 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="868416ad-83ed-4fa1-897b-14f454303f2c" containerName="extract-utilities" Dec 02 20:23:51 
crc kubenswrapper[4807]: I1202 20:23:51.080560 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="868416ad-83ed-4fa1-897b-14f454303f2c" containerName="registry-server" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.081944 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.090029 4807 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod61a9f7ef-7bd0-49c9-8cd7-3eca252dca72"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod61a9f7ef-7bd0-49c9-8cd7-3eca252dca72] : Timed out while waiting for systemd to remove kubepods-besteffort-pod61a9f7ef_7bd0_49c9_8cd7_3eca252dca72.slice" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.104010 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vl9gx"] Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.247764 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-config\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.247824 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.247863 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.247914 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmfg7\" (UniqueName: \"kubernetes.io/projected/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-kube-api-access-dmfg7\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.247947 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.248632 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.352053 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-config\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.352578 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.352624 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.352680 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmfg7\" (UniqueName: \"kubernetes.io/projected/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-kube-api-access-dmfg7\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.352747 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.353047 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.353628 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-config\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.353790 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.353988 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.354212 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.355199 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.387531 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmfg7\" (UniqueName: \"kubernetes.io/projected/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-kube-api-access-dmfg7\") pod 
\"dnsmasq-dns-cd5cbd7b9-vl9gx\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.422683 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.860253 4807 generic.go:334] "Generic (PLEG): container finished" podID="11461dcc-2c59-4b98-b246-5908dca9a42b" containerID="649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601" exitCode=0 Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.860473 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j2hj" event={"ID":"11461dcc-2c59-4b98-b246-5908dca9a42b","Type":"ContainerDied","Data":"649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601"} Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.860991 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j2hj" event={"ID":"11461dcc-2c59-4b98-b246-5908dca9a42b","Type":"ContainerStarted","Data":"9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246"} Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.893505 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4j2hj" podStartSLOduration=2.46490047 podStartE2EDuration="3.892908631s" podCreationTimestamp="2025-12-02 20:23:48 +0000 UTC" firstStartedPulling="2025-12-02 20:23:49.786843011 +0000 UTC m=+1565.087750526" lastFinishedPulling="2025-12-02 20:23:51.214851192 +0000 UTC m=+1566.515758687" observedRunningTime="2025-12-02 20:23:51.881566693 +0000 UTC m=+1567.182474188" watchObservedRunningTime="2025-12-02 20:23:51.892908631 +0000 UTC m=+1567.193816126" Dec 02 20:23:51 crc kubenswrapper[4807]: I1202 20:23:51.934572 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-cd5cbd7b9-vl9gx"] Dec 02 20:23:51 crc kubenswrapper[4807]: W1202 20:23:51.947080 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5c801bb_f6e2_4c8f_855e_eb5eb060d6b1.slice/crio-9621e06487dfd855b1de8f2e360b0f888af0cbace331c994d8edfe154214e216 WatchSource:0}: Error finding container 9621e06487dfd855b1de8f2e360b0f888af0cbace331c994d8edfe154214e216: Status 404 returned error can't find the container with id 9621e06487dfd855b1de8f2e360b0f888af0cbace331c994d8edfe154214e216 Dec 02 20:23:52 crc kubenswrapper[4807]: I1202 20:23:52.874917 4807 generic.go:334] "Generic (PLEG): container finished" podID="b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" containerID="fee7d5fb73660da659de84b8f792576de6fb7e5b289f4b5f528ff02021381118" exitCode=0 Dec 02 20:23:52 crc kubenswrapper[4807]: I1202 20:23:52.875130 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" event={"ID":"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1","Type":"ContainerDied","Data":"fee7d5fb73660da659de84b8f792576de6fb7e5b289f4b5f528ff02021381118"} Dec 02 20:23:52 crc kubenswrapper[4807]: I1202 20:23:52.875685 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" event={"ID":"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1","Type":"ContainerStarted","Data":"9621e06487dfd855b1de8f2e360b0f888af0cbace331c994d8edfe154214e216"} Dec 02 20:23:53 crc kubenswrapper[4807]: I1202 20:23:53.212704 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:53 crc kubenswrapper[4807]: I1202 20:23:53.719422 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:23:53 crc kubenswrapper[4807]: I1202 20:23:53.720032 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="ceilometer-central-agent" containerID="cri-o://c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4" gracePeriod=30 Dec 02 20:23:53 crc kubenswrapper[4807]: I1202 20:23:53.720207 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="sg-core" containerID="cri-o://86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b" gracePeriod=30 Dec 02 20:23:53 crc kubenswrapper[4807]: I1202 20:23:53.720283 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="proxy-httpd" containerID="cri-o://cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf" gracePeriod=30 Dec 02 20:23:53 crc kubenswrapper[4807]: I1202 20:23:53.720408 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="ceilometer-notification-agent" containerID="cri-o://f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e" gracePeriod=30 Dec 02 20:23:53 crc kubenswrapper[4807]: I1202 20:23:53.900213 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" event={"ID":"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1","Type":"ContainerStarted","Data":"c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900"} Dec 02 20:23:53 crc kubenswrapper[4807]: I1202 20:23:53.900307 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:23:53 crc kubenswrapper[4807]: I1202 20:23:53.910010 4807 generic.go:334] "Generic (PLEG): container finished" podID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerID="86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b" exitCode=2 Dec 02 20:23:53 crc 
kubenswrapper[4807]: I1202 20:23:53.910095 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e9a71e7-d4c0-48ce-94bd-e5563864adfa","Type":"ContainerDied","Data":"86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b"} Dec 02 20:23:53 crc kubenswrapper[4807]: I1202 20:23:53.940156 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" podStartSLOduration=2.940127609 podStartE2EDuration="2.940127609s" podCreationTimestamp="2025-12-02 20:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:23:53.923512014 +0000 UTC m=+1569.224419509" watchObservedRunningTime="2025-12-02 20:23:53.940127609 +0000 UTC m=+1569.241035094" Dec 02 20:23:54 crc kubenswrapper[4807]: E1202 20:23:54.009354 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e9a71e7_d4c0_48ce_94bd_e5563864adfa.slice/crio-conmon-cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:23:54 crc kubenswrapper[4807]: I1202 20:23:54.032188 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:54 crc kubenswrapper[4807]: I1202 20:23:54.032591 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerName="nova-api-api" containerID="cri-o://026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85" gracePeriod=30 Dec 02 20:23:54 crc kubenswrapper[4807]: I1202 20:23:54.032556 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerName="nova-api-log" 
containerID="cri-o://6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21" gracePeriod=30 Dec 02 20:23:54 crc kubenswrapper[4807]: I1202 20:23:54.924794 4807 generic.go:334] "Generic (PLEG): container finished" podID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerID="6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21" exitCode=143 Dec 02 20:23:54 crc kubenswrapper[4807]: I1202 20:23:54.925655 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e02b468-5c68-4284-8a76-2be6380aeb8b","Type":"ContainerDied","Data":"6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21"} Dec 02 20:23:54 crc kubenswrapper[4807]: I1202 20:23:54.928382 4807 generic.go:334] "Generic (PLEG): container finished" podID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerID="cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf" exitCode=0 Dec 02 20:23:54 crc kubenswrapper[4807]: I1202 20:23:54.928409 4807 generic.go:334] "Generic (PLEG): container finished" podID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerID="c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4" exitCode=0 Dec 02 20:23:54 crc kubenswrapper[4807]: I1202 20:23:54.929887 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e9a71e7-d4c0-48ce-94bd-e5563864adfa","Type":"ContainerDied","Data":"cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf"} Dec 02 20:23:54 crc kubenswrapper[4807]: I1202 20:23:54.929920 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e9a71e7-d4c0-48ce-94bd-e5563864adfa","Type":"ContainerDied","Data":"c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4"} Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.774696 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.860460 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e02b468-5c68-4284-8a76-2be6380aeb8b-logs\") pod \"0e02b468-5c68-4284-8a76-2be6380aeb8b\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.860879 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-combined-ca-bundle\") pod \"0e02b468-5c68-4284-8a76-2be6380aeb8b\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.860989 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-config-data\") pod \"0e02b468-5c68-4284-8a76-2be6380aeb8b\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.861066 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4z7q\" (UniqueName: \"kubernetes.io/projected/0e02b468-5c68-4284-8a76-2be6380aeb8b-kube-api-access-q4z7q\") pod \"0e02b468-5c68-4284-8a76-2be6380aeb8b\" (UID: \"0e02b468-5c68-4284-8a76-2be6380aeb8b\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.861589 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e02b468-5c68-4284-8a76-2be6380aeb8b-logs" (OuterVolumeSpecName: "logs") pod "0e02b468-5c68-4284-8a76-2be6380aeb8b" (UID: "0e02b468-5c68-4284-8a76-2be6380aeb8b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.862198 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e02b468-5c68-4284-8a76-2be6380aeb8b-logs\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.870305 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e02b468-5c68-4284-8a76-2be6380aeb8b-kube-api-access-q4z7q" (OuterVolumeSpecName: "kube-api-access-q4z7q") pod "0e02b468-5c68-4284-8a76-2be6380aeb8b" (UID: "0e02b468-5c68-4284-8a76-2be6380aeb8b"). InnerVolumeSpecName "kube-api-access-q4z7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.878508 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.919058 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-config-data" (OuterVolumeSpecName: "config-data") pod "0e02b468-5c68-4284-8a76-2be6380aeb8b" (UID: "0e02b468-5c68-4284-8a76-2be6380aeb8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.921263 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e02b468-5c68-4284-8a76-2be6380aeb8b" (UID: "0e02b468-5c68-4284-8a76-2be6380aeb8b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.964054 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-sg-core-conf-yaml\") pod \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.964149 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-scripts\") pod \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.964219 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-run-httpd\") pod \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.964303 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-config-data\") pod \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.964346 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-combined-ca-bundle\") pod \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.964396 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-ceilometer-tls-certs\") pod \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.964439 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-log-httpd\") pod \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.964504 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzdfc\" (UniqueName: \"kubernetes.io/projected/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-kube-api-access-hzdfc\") pod \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\" (UID: \"6e9a71e7-d4c0-48ce-94bd-e5563864adfa\") " Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.965364 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.965390 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4z7q\" (UniqueName: \"kubernetes.io/projected/0e02b468-5c68-4284-8a76-2be6380aeb8b-kube-api-access-q4z7q\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.965406 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e02b468-5c68-4284-8a76-2be6380aeb8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.967062 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e9a71e7-d4c0-48ce-94bd-e5563864adfa" 
(UID: "6e9a71e7-d4c0-48ce-94bd-e5563864adfa"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.967497 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e9a71e7-d4c0-48ce-94bd-e5563864adfa" (UID: "6e9a71e7-d4c0-48ce-94bd-e5563864adfa"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.972640 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-scripts" (OuterVolumeSpecName: "scripts") pod "6e9a71e7-d4c0-48ce-94bd-e5563864adfa" (UID: "6e9a71e7-d4c0-48ce-94bd-e5563864adfa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:57 crc kubenswrapper[4807]: I1202 20:23:57.983230 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-kube-api-access-hzdfc" (OuterVolumeSpecName: "kube-api-access-hzdfc") pod "6e9a71e7-d4c0-48ce-94bd-e5563864adfa" (UID: "6e9a71e7-d4c0-48ce-94bd-e5563864adfa"). InnerVolumeSpecName "kube-api-access-hzdfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.006731 4807 generic.go:334] "Generic (PLEG): container finished" podID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerID="026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85" exitCode=0 Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.007185 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e02b468-5c68-4284-8a76-2be6380aeb8b","Type":"ContainerDied","Data":"026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85"} Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.007239 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e02b468-5c68-4284-8a76-2be6380aeb8b","Type":"ContainerDied","Data":"8f8fbbfdaac2aa162423c5f413086a96bae3f27b4f26f21e453b1c6a8827684e"} Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.007261 4807 scope.go:117] "RemoveContainer" containerID="026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.007451 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.014261 4807 generic.go:334] "Generic (PLEG): container finished" podID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerID="f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e" exitCode=0 Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.014333 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e9a71e7-d4c0-48ce-94bd-e5563864adfa","Type":"ContainerDied","Data":"f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e"} Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.014373 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e9a71e7-d4c0-48ce-94bd-e5563864adfa","Type":"ContainerDied","Data":"a0cc5110af2ae31cc80f2eed787494835c038a0bce8a301e38c042513c175492"} Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.014452 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.043092 4807 scope.go:117] "RemoveContainer" containerID="6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.064973 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e9a71e7-d4c0-48ce-94bd-e5563864adfa" (UID: "6e9a71e7-d4c0-48ce-94bd-e5563864adfa"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.069098 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6e9a71e7-d4c0-48ce-94bd-e5563864adfa" (UID: "6e9a71e7-d4c0-48ce-94bd-e5563864adfa"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.067638 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzdfc\" (UniqueName: \"kubernetes.io/projected/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-kube-api-access-hzdfc\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.070797 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.070869 4807 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.070934 4807 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.085361 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.099154 4807 scope.go:117] "RemoveContainer" containerID="026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85" Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 20:23:58.108104 4807 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85\": container with ID starting with 026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85 not found: ID does not exist" containerID="026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.108547 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85"} err="failed to get container status \"026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85\": rpc error: code = NotFound desc = could not find container \"026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85\": container with ID starting with 026cdc95ff849bfa4ccf3a985e3e7352c475aca8ed96f7ef2cb4676ced9dfa85 not found: ID does not exist" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.108737 4807 scope.go:117] "RemoveContainer" containerID="6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.110899 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 20:23:58.111137 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21\": container with ID starting with 6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21 not found: ID does not exist" containerID="6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.111258 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21"} err="failed to get container status 
\"6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21\": rpc error: code = NotFound desc = could not find container \"6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21\": container with ID starting with 6a1c16a32af0158ac6b1eed73892096ea38f9054a47c43e0bbd0f999eddade21 not found: ID does not exist" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.111352 4807 scope.go:117] "RemoveContainer" containerID="cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.122044 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 20:23:58.122960 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="ceilometer-notification-agent" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.122986 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="ceilometer-notification-agent" Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 20:23:58.123025 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerName="nova-api-api" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.123035 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerName="nova-api-api" Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 20:23:58.123054 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="sg-core" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.123063 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="sg-core" Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 20:23:58.123085 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="ceilometer-central-agent" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.123093 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="ceilometer-central-agent" Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 20:23:58.123118 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerName="nova-api-log" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.123126 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerName="nova-api-log" Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 20:23:58.123152 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="proxy-httpd" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.123161 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="proxy-httpd" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.123466 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerName="nova-api-api" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.123487 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="proxy-httpd" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.123510 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="ceilometer-notification-agent" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.123527 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="ceilometer-central-agent" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.123542 4807 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" containerName="sg-core" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.123563 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e02b468-5c68-4284-8a76-2be6380aeb8b" containerName="nova-api-log" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.125317 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.131463 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.134073 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.135737 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.139586 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.140806 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e9a71e7-d4c0-48ce-94bd-e5563864adfa" (UID: "6e9a71e7-d4c0-48ce-94bd-e5563864adfa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.167362 4807 scope.go:117] "RemoveContainer" containerID="86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.173598 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.180946 4807 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.180969 4807 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.210815 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-config-data" (OuterVolumeSpecName: "config-data") pod "6e9a71e7-d4c0-48ce-94bd-e5563864adfa" (UID: "6e9a71e7-d4c0-48ce-94bd-e5563864adfa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.211772 4807 scope.go:117] "RemoveContainer" containerID="f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.216074 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.236267 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.240413 4807 scope.go:117] "RemoveContainer" containerID="c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.283768 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-logs\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.283904 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.283947 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.284077 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26tms\" (UniqueName: \"kubernetes.io/projected/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-kube-api-access-26tms\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.284186 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-config-data\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.284212 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.286064 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e9a71e7-d4c0-48ce-94bd-e5563864adfa-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.293762 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.293860 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.359748 4807 scope.go:117] "RemoveContainer" containerID="cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf" Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 20:23:58.360359 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf\": container with ID starting with cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf not found: ID does not exist" containerID="cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.360400 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf"} err="failed to get container status \"cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf\": rpc error: code = NotFound desc = could not find container \"cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf\": container with ID starting with cec519dee2c86c2e9ee18316c11005659f23b98aa15463ba5e587aea3aa85aaf not found: ID does not exist" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.360425 4807 scope.go:117] "RemoveContainer" containerID="86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b" Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 20:23:58.360614 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b\": container with ID starting with 86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b not found: ID does not exist" containerID="86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.360637 4807 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b"} err="failed to get container status \"86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b\": rpc error: code = NotFound desc = could not find container \"86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b\": container with ID starting with 86ce352ca57a0e1c13ded632c42c78b3fec7d98cb0a269e4c4f8386848af1d5b not found: ID does not exist" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.360651 4807 scope.go:117] "RemoveContainer" containerID="f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e" Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 20:23:58.360852 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e\": container with ID starting with f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e not found: ID does not exist" containerID="f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.360874 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e"} err="failed to get container status \"f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e\": rpc error: code = NotFound desc = could not find container \"f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e\": container with ID starting with f4ea173ca5e5f4f20e07bd31443d4f2627e605f815e89547782994797d03bd4e not found: ID does not exist" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.360886 4807 scope.go:117] "RemoveContainer" containerID="c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4" Dec 02 20:23:58 crc kubenswrapper[4807]: E1202 
20:23:58.361063 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4\": container with ID starting with c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4 not found: ID does not exist" containerID="c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.361080 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4"} err="failed to get container status \"c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4\": rpc error: code = NotFound desc = could not find container \"c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4\": container with ID starting with c48d220d1a579e4d460d431b3f821e6ca2f1ef17bf31930261c852593e07c3b4 not found: ID does not exist" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.369695 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.388317 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-logs\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.388468 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.388532 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.388617 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26tms\" (UniqueName: \"kubernetes.io/projected/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-kube-api-access-26tms\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.388791 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-config-data\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.388834 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.389018 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-logs\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.393196 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.393387 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.394234 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-config-data\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.396577 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.400262 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-public-tls-certs\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.420831 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.421822 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26tms\" (UniqueName: \"kubernetes.io/projected/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-kube-api-access-26tms\") pod \"nova-api-0\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") " pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.423783 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.429422 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.429663 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.429813 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.463750 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.469674 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.598096 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-config-data\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.602634 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d90a40-6c28-4353-91b3-87e966ad1ac7-log-httpd\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.602684 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk92r\" (UniqueName: \"kubernetes.io/projected/10d90a40-6c28-4353-91b3-87e966ad1ac7-kube-api-access-xk92r\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 
02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.602960 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-scripts\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.602987 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.603083 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.603239 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d90a40-6c28-4353-91b3-87e966ad1ac7-run-httpd\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.603357 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.692351 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.692794 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.705148 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-config-data\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.705271 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d90a40-6c28-4353-91b3-87e966ad1ac7-log-httpd\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.705305 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk92r\" (UniqueName: \"kubernetes.io/projected/10d90a40-6c28-4353-91b3-87e966ad1ac7-kube-api-access-xk92r\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.705364 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-scripts\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.705388 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " 
pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.705430 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.705516 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d90a40-6c28-4353-91b3-87e966ad1ac7-run-httpd\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.705565 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.709866 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d90a40-6c28-4353-91b3-87e966ad1ac7-log-httpd\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.710488 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10d90a40-6c28-4353-91b3-87e966ad1ac7-run-httpd\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.718338 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.718641 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.719399 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-config-data\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.721814 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-scripts\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.722087 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10d90a40-6c28-4353-91b3-87e966ad1ac7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.730379 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk92r\" (UniqueName: \"kubernetes.io/projected/10d90a40-6c28-4353-91b3-87e966ad1ac7-kube-api-access-xk92r\") pod \"ceilometer-0\" (UID: \"10d90a40-6c28-4353-91b3-87e966ad1ac7\") " pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.765268 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.907529 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.996868 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e02b468-5c68-4284-8a76-2be6380aeb8b" path="/var/lib/kubelet/pods/0e02b468-5c68-4284-8a76-2be6380aeb8b/volumes" Dec 02 20:23:58 crc kubenswrapper[4807]: I1202 20:23:58.998393 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e9a71e7-d4c0-48ce-94bd-e5563864adfa" path="/var/lib/kubelet/pods/6e9a71e7-d4c0-48ce-94bd-e5563864adfa/volumes" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.002512 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.058157 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371","Type":"ContainerStarted","Data":"8a1c3e17bf3d9452eb837c46884f4e42c81bd255659c4c331a754b590d8f19c3"} Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.094087 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.192821 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.288960 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j2hj"] Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.417854 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-fzzjd"] Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.420390 
4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.428994 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fzzjd"] Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.430636 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.430968 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.467392 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 20:23:59 crc kubenswrapper[4807]: W1202 20:23:59.476511 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10d90a40_6c28_4353_91b3_87e966ad1ac7.slice/crio-f78d599096a74ae1e19807ed65e43ea5c23255b961310a9c8a3e3cac2e21f0f0 WatchSource:0}: Error finding container f78d599096a74ae1e19807ed65e43ea5c23255b961310a9c8a3e3cac2e21f0f0: Status 404 returned error can't find the container with id f78d599096a74ae1e19807ed65e43ea5c23255b961310a9c8a3e3cac2e21f0f0 Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.539240 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-scripts\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.539383 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znwhd\" (UniqueName: 
\"kubernetes.io/projected/dd0c1a2e-9558-4538-8e36-aa4def438cc1-kube-api-access-znwhd\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.539432 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-config-data\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.539450 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.642287 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-config-data\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.643200 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.643690 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-scripts\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.644190 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znwhd\" (UniqueName: \"kubernetes.io/projected/dd0c1a2e-9558-4538-8e36-aa4def438cc1-kube-api-access-znwhd\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.655791 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-scripts\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.655974 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.656233 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-config-data\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.672537 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znwhd\" (UniqueName: 
\"kubernetes.io/projected/dd0c1a2e-9558-4538-8e36-aa4def438cc1-kube-api-access-znwhd\") pod \"nova-cell1-cell-mapping-fzzjd\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:23:59 crc kubenswrapper[4807]: I1202 20:23:59.770964 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:24:00 crc kubenswrapper[4807]: I1202 20:24:00.158337 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d90a40-6c28-4353-91b3-87e966ad1ac7","Type":"ContainerStarted","Data":"f78d599096a74ae1e19807ed65e43ea5c23255b961310a9c8a3e3cac2e21f0f0"} Dec 02 20:24:00 crc kubenswrapper[4807]: I1202 20:24:00.198781 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371","Type":"ContainerStarted","Data":"86e6601427412c7c3af59e9e2879f74c1dc371d5461e4542966ab7f4b570ad15"} Dec 02 20:24:00 crc kubenswrapper[4807]: I1202 20:24:00.198830 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371","Type":"ContainerStarted","Data":"f653b180249a9a4e8bde1e96a28376e953a014feaa1d94c86eafbf91ac6c9f9b"} Dec 02 20:24:00 crc kubenswrapper[4807]: I1202 20:24:00.272782 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.272752781 podStartE2EDuration="2.272752781s" podCreationTimestamp="2025-12-02 20:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:24:00.235545882 +0000 UTC m=+1575.536453377" watchObservedRunningTime="2025-12-02 20:24:00.272752781 +0000 UTC m=+1575.573660276" Dec 02 20:24:00 crc kubenswrapper[4807]: I1202 20:24:00.447187 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-fzzjd"] Dec 02 20:24:01 crc kubenswrapper[4807]: I1202 20:24:01.215245 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d90a40-6c28-4353-91b3-87e966ad1ac7","Type":"ContainerStarted","Data":"e9c8e680431b7cda21ae677bc8cf09f66806de42b872a3d0d62c770a9c84c87d"} Dec 02 20:24:01 crc kubenswrapper[4807]: I1202 20:24:01.218219 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzzjd" event={"ID":"dd0c1a2e-9558-4538-8e36-aa4def438cc1","Type":"ContainerStarted","Data":"d8df39f6a448db0c81318fae0d805a9afcfff9aaedf924c7d66119bc0a63d1a9"} Dec 02 20:24:01 crc kubenswrapper[4807]: I1202 20:24:01.218303 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzzjd" event={"ID":"dd0c1a2e-9558-4538-8e36-aa4def438cc1","Type":"ContainerStarted","Data":"8fd28d7b2017115274a2b1172b871f09c49dfdedf07e40d643c790d0d337d088"} Dec 02 20:24:01 crc kubenswrapper[4807]: I1202 20:24:01.218573 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4j2hj" podUID="11461dcc-2c59-4b98-b246-5908dca9a42b" containerName="registry-server" containerID="cri-o://9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246" gracePeriod=2 Dec 02 20:24:01 crc kubenswrapper[4807]: I1202 20:24:01.249508 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-fzzjd" podStartSLOduration=2.249479478 podStartE2EDuration="2.249479478s" podCreationTimestamp="2025-12-02 20:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:24:01.244343565 +0000 UTC m=+1576.545251050" watchObservedRunningTime="2025-12-02 20:24:01.249479478 +0000 UTC m=+1576.550386973" Dec 02 20:24:01 crc kubenswrapper[4807]: I1202 20:24:01.426012 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:24:01 crc kubenswrapper[4807]: I1202 20:24:01.533274 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-c7m8h"] Dec 02 20:24:01 crc kubenswrapper[4807]: I1202 20:24:01.534225 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" podUID="d2148eff-b1b3-45a5-9e9e-0769521c4cb7" containerName="dnsmasq-dns" containerID="cri-o://7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d" gracePeriod=10 Dec 02 20:24:01 crc kubenswrapper[4807]: I1202 20:24:01.931381 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.038700 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lszb7\" (UniqueName: \"kubernetes.io/projected/11461dcc-2c59-4b98-b246-5908dca9a42b-kube-api-access-lszb7\") pod \"11461dcc-2c59-4b98-b246-5908dca9a42b\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.038927 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-catalog-content\") pod \"11461dcc-2c59-4b98-b246-5908dca9a42b\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.038984 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-utilities\") pod \"11461dcc-2c59-4b98-b246-5908dca9a42b\" (UID: \"11461dcc-2c59-4b98-b246-5908dca9a42b\") " Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.044408 4807 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-utilities" (OuterVolumeSpecName: "utilities") pod "11461dcc-2c59-4b98-b246-5908dca9a42b" (UID: "11461dcc-2c59-4b98-b246-5908dca9a42b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.049906 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11461dcc-2c59-4b98-b246-5908dca9a42b-kube-api-access-lszb7" (OuterVolumeSpecName: "kube-api-access-lszb7") pod "11461dcc-2c59-4b98-b246-5908dca9a42b" (UID: "11461dcc-2c59-4b98-b246-5908dca9a42b"). InnerVolumeSpecName "kube-api-access-lszb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.060300 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11461dcc-2c59-4b98-b246-5908dca9a42b" (UID: "11461dcc-2c59-4b98-b246-5908dca9a42b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.149563 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lszb7\" (UniqueName: \"kubernetes.io/projected/11461dcc-2c59-4b98-b246-5908dca9a42b-kube-api-access-lszb7\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.150014 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.150025 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11461dcc-2c59-4b98-b246-5908dca9a42b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.208381 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.289813 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d90a40-6c28-4353-91b3-87e966ad1ac7","Type":"ContainerStarted","Data":"5b233df9f31684a2fb0bc56af3cc3eb2a6d3ce84c81b613516b1712903dc0fb1"} Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.290015 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d90a40-6c28-4353-91b3-87e966ad1ac7","Type":"ContainerStarted","Data":"6abeeaa94a92ae874c00ad860433395130949519fbc9813072ffde49af79c426"} Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.292732 4807 generic.go:334] "Generic (PLEG): container finished" podID="d2148eff-b1b3-45a5-9e9e-0769521c4cb7" containerID="7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d" exitCode=0 Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.292790 4807 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" event={"ID":"d2148eff-b1b3-45a5-9e9e-0769521c4cb7","Type":"ContainerDied","Data":"7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d"} Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.292813 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" event={"ID":"d2148eff-b1b3-45a5-9e9e-0769521c4cb7","Type":"ContainerDied","Data":"b9444278b2ff43dfc57fb14ec5fa09d67108b967bceb1b4734a0cc321c807997"} Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.292839 4807 scope.go:117] "RemoveContainer" containerID="7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.293174 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-c7m8h" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.313790 4807 generic.go:334] "Generic (PLEG): container finished" podID="11461dcc-2c59-4b98-b246-5908dca9a42b" containerID="9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246" exitCode=0 Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.316076 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4j2hj" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.316832 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j2hj" event={"ID":"11461dcc-2c59-4b98-b246-5908dca9a42b","Type":"ContainerDied","Data":"9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246"} Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.316939 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4j2hj" event={"ID":"11461dcc-2c59-4b98-b246-5908dca9a42b","Type":"ContainerDied","Data":"2ff14c1b19a107fd4c3be772cbf49cb2a07fd95f3b4e0ea2bddd770abb5f85fa"} Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.348241 4807 scope.go:117] "RemoveContainer" containerID="b3cfbd259912c35fa2910204c7d82f4bf04097a96271e35732a88eb864ca2c48" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.368522 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-swift-storage-0\") pod \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.368595 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-config\") pod \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.368622 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-svc\") pod \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 
20:24:02.368647 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-nb\") pod \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.368798 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54mn8\" (UniqueName: \"kubernetes.io/projected/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-kube-api-access-54mn8\") pod \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.368870 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-sb\") pod \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\" (UID: \"d2148eff-b1b3-45a5-9e9e-0769521c4cb7\") " Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.395352 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-kube-api-access-54mn8" (OuterVolumeSpecName: "kube-api-access-54mn8") pod "d2148eff-b1b3-45a5-9e9e-0769521c4cb7" (UID: "d2148eff-b1b3-45a5-9e9e-0769521c4cb7"). InnerVolumeSpecName "kube-api-access-54mn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.420360 4807 scope.go:117] "RemoveContainer" containerID="7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d" Dec 02 20:24:02 crc kubenswrapper[4807]: E1202 20:24:02.421132 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d\": container with ID starting with 7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d not found: ID does not exist" containerID="7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.421228 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d"} err="failed to get container status \"7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d\": rpc error: code = NotFound desc = could not find container \"7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d\": container with ID starting with 7533ea433989f8f19703bbedb950be9ebc6ced89e5bcd411e1fe4143884a943d not found: ID does not exist" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.421256 4807 scope.go:117] "RemoveContainer" containerID="b3cfbd259912c35fa2910204c7d82f4bf04097a96271e35732a88eb864ca2c48" Dec 02 20:24:02 crc kubenswrapper[4807]: E1202 20:24:02.422009 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3cfbd259912c35fa2910204c7d82f4bf04097a96271e35732a88eb864ca2c48\": container with ID starting with b3cfbd259912c35fa2910204c7d82f4bf04097a96271e35732a88eb864ca2c48 not found: ID does not exist" containerID="b3cfbd259912c35fa2910204c7d82f4bf04097a96271e35732a88eb864ca2c48" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.422085 
4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3cfbd259912c35fa2910204c7d82f4bf04097a96271e35732a88eb864ca2c48"} err="failed to get container status \"b3cfbd259912c35fa2910204c7d82f4bf04097a96271e35732a88eb864ca2c48\": rpc error: code = NotFound desc = could not find container \"b3cfbd259912c35fa2910204c7d82f4bf04097a96271e35732a88eb864ca2c48\": container with ID starting with b3cfbd259912c35fa2910204c7d82f4bf04097a96271e35732a88eb864ca2c48 not found: ID does not exist" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.422123 4807 scope.go:117] "RemoveContainer" containerID="9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.424354 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j2hj"] Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.446659 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d2148eff-b1b3-45a5-9e9e-0769521c4cb7" (UID: "d2148eff-b1b3-45a5-9e9e-0769521c4cb7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.447256 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-config" (OuterVolumeSpecName: "config") pod "d2148eff-b1b3-45a5-9e9e-0769521c4cb7" (UID: "d2148eff-b1b3-45a5-9e9e-0769521c4cb7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.455377 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d2148eff-b1b3-45a5-9e9e-0769521c4cb7" (UID: "d2148eff-b1b3-45a5-9e9e-0769521c4cb7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.456586 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4j2hj"] Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.457849 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d2148eff-b1b3-45a5-9e9e-0769521c4cb7" (UID: "d2148eff-b1b3-45a5-9e9e-0769521c4cb7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.470314 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d2148eff-b1b3-45a5-9e9e-0769521c4cb7" (UID: "d2148eff-b1b3-45a5-9e9e-0769521c4cb7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.476556 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.476615 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.476627 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.476637 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.476647 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54mn8\" (UniqueName: \"kubernetes.io/projected/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-kube-api-access-54mn8\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.476658 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2148eff-b1b3-45a5-9e9e-0769521c4cb7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.563569 4807 scope.go:117] "RemoveContainer" containerID="649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.595253 4807 scope.go:117] "RemoveContainer" 
containerID="c1bec5bf48c00f954b3ac21a24e3c2b21ae22784767745c914af6e75791790d8" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.657449 4807 scope.go:117] "RemoveContainer" containerID="9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246" Dec 02 20:24:02 crc kubenswrapper[4807]: E1202 20:24:02.658059 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246\": container with ID starting with 9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246 not found: ID does not exist" containerID="9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.658093 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246"} err="failed to get container status \"9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246\": rpc error: code = NotFound desc = could not find container \"9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246\": container with ID starting with 9ea34be6ca91f9578d0b29737db48edd7fe43b390b84448670c82803b3df2246 not found: ID does not exist" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.658124 4807 scope.go:117] "RemoveContainer" containerID="649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601" Dec 02 20:24:02 crc kubenswrapper[4807]: E1202 20:24:02.658373 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601\": container with ID starting with 649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601 not found: ID does not exist" containerID="649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601" Dec 02 20:24:02 crc 
kubenswrapper[4807]: I1202 20:24:02.658396 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601"} err="failed to get container status \"649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601\": rpc error: code = NotFound desc = could not find container \"649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601\": container with ID starting with 649306953ad05f24efb11e794d7dd394436fe4ef6e867161d4d9480d65bd6601 not found: ID does not exist" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.658411 4807 scope.go:117] "RemoveContainer" containerID="c1bec5bf48c00f954b3ac21a24e3c2b21ae22784767745c914af6e75791790d8" Dec 02 20:24:02 crc kubenswrapper[4807]: E1202 20:24:02.658660 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1bec5bf48c00f954b3ac21a24e3c2b21ae22784767745c914af6e75791790d8\": container with ID starting with c1bec5bf48c00f954b3ac21a24e3c2b21ae22784767745c914af6e75791790d8 not found: ID does not exist" containerID="c1bec5bf48c00f954b3ac21a24e3c2b21ae22784767745c914af6e75791790d8" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.658691 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1bec5bf48c00f954b3ac21a24e3c2b21ae22784767745c914af6e75791790d8"} err="failed to get container status \"c1bec5bf48c00f954b3ac21a24e3c2b21ae22784767745c914af6e75791790d8\": rpc error: code = NotFound desc = could not find container \"c1bec5bf48c00f954b3ac21a24e3c2b21ae22784767745c914af6e75791790d8\": container with ID starting with c1bec5bf48c00f954b3ac21a24e3c2b21ae22784767745c914af6e75791790d8 not found: ID does not exist" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.660852 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-c7m8h"] Dec 02 20:24:02 crc 
kubenswrapper[4807]: I1202 20:24:02.675740 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-c7m8h"] Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.991227 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11461dcc-2c59-4b98-b246-5908dca9a42b" path="/var/lib/kubelet/pods/11461dcc-2c59-4b98-b246-5908dca9a42b/volumes" Dec 02 20:24:02 crc kubenswrapper[4807]: I1202 20:24:02.994648 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2148eff-b1b3-45a5-9e9e-0769521c4cb7" path="/var/lib/kubelet/pods/d2148eff-b1b3-45a5-9e9e-0769521c4cb7/volumes" Dec 02 20:24:04 crc kubenswrapper[4807]: I1202 20:24:04.346582 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"10d90a40-6c28-4353-91b3-87e966ad1ac7","Type":"ContainerStarted","Data":"89adc92ce7b241b35584c45c0404ba81295f4d3aef3a3ab432b68c0b4dae5cee"} Dec 02 20:24:04 crc kubenswrapper[4807]: I1202 20:24:04.347341 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 20:24:04 crc kubenswrapper[4807]: I1202 20:24:04.388524 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.381704997 podStartE2EDuration="6.388500511s" podCreationTimestamp="2025-12-02 20:23:58 +0000 UTC" firstStartedPulling="2025-12-02 20:23:59.482173929 +0000 UTC m=+1574.783081424" lastFinishedPulling="2025-12-02 20:24:03.488969443 +0000 UTC m=+1578.789876938" observedRunningTime="2025-12-02 20:24:04.385574714 +0000 UTC m=+1579.686482219" watchObservedRunningTime="2025-12-02 20:24:04.388500511 +0000 UTC m=+1579.689408006" Dec 02 20:24:07 crc kubenswrapper[4807]: I1202 20:24:07.396694 4807 generic.go:334] "Generic (PLEG): container finished" podID="dd0c1a2e-9558-4538-8e36-aa4def438cc1" containerID="d8df39f6a448db0c81318fae0d805a9afcfff9aaedf924c7d66119bc0a63d1a9" exitCode=0 Dec 02 20:24:07 crc 
kubenswrapper[4807]: I1202 20:24:07.396989 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzzjd" event={"ID":"dd0c1a2e-9558-4538-8e36-aa4def438cc1","Type":"ContainerDied","Data":"d8df39f6a448db0c81318fae0d805a9afcfff9aaedf924c7d66119bc0a63d1a9"} Dec 02 20:24:08 crc kubenswrapper[4807]: I1202 20:24:08.470823 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 20:24:08 crc kubenswrapper[4807]: I1202 20:24:08.471513 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:08.888643 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.068122 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-combined-ca-bundle\") pod \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.068201 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-scripts\") pod \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.068310 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-config-data\") pod \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.068637 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-znwhd\" (UniqueName: \"kubernetes.io/projected/dd0c1a2e-9558-4538-8e36-aa4def438cc1-kube-api-access-znwhd\") pod \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\" (UID: \"dd0c1a2e-9558-4538-8e36-aa4def438cc1\") " Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.089520 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-scripts" (OuterVolumeSpecName: "scripts") pod "dd0c1a2e-9558-4538-8e36-aa4def438cc1" (UID: "dd0c1a2e-9558-4538-8e36-aa4def438cc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.089788 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd0c1a2e-9558-4538-8e36-aa4def438cc1-kube-api-access-znwhd" (OuterVolumeSpecName: "kube-api-access-znwhd") pod "dd0c1a2e-9558-4538-8e36-aa4def438cc1" (UID: "dd0c1a2e-9558-4538-8e36-aa4def438cc1"). InnerVolumeSpecName "kube-api-access-znwhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.114641 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-config-data" (OuterVolumeSpecName: "config-data") pod "dd0c1a2e-9558-4538-8e36-aa4def438cc1" (UID: "dd0c1a2e-9558-4538-8e36-aa4def438cc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.125442 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd0c1a2e-9558-4538-8e36-aa4def438cc1" (UID: "dd0c1a2e-9558-4538-8e36-aa4def438cc1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.172511 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znwhd\" (UniqueName: \"kubernetes.io/projected/dd0c1a2e-9558-4538-8e36-aa4def438cc1-kube-api-access-znwhd\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.172552 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.172561 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.172573 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd0c1a2e-9558-4538-8e36-aa4def438cc1-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.426640 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzzjd" event={"ID":"dd0c1a2e-9558-4538-8e36-aa4def438cc1","Type":"ContainerDied","Data":"8fd28d7b2017115274a2b1172b871f09c49dfdedf07e40d643c790d0d337d088"} Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.426699 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fd28d7b2017115274a2b1172b871f09c49dfdedf07e40d643c790d0d337d088" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.426963 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzzjd" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.495975 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.495975 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.640252 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.640776 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6d49888e-5c74-4550-b491-8b526b9fe3a8" containerName="nova-scheduler-scheduler" containerID="cri-o://d9c534ccd25cbca66b8c1cb2a6f93fd09e50be6710183062e4025e1b1fb1780e" gracePeriod=30 Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.658076 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.658757 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerName="nova-api-log" containerID="cri-o://f653b180249a9a4e8bde1e96a28376e953a014feaa1d94c86eafbf91ac6c9f9b" gracePeriod=30 Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.658918 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerName="nova-api-api" containerID="cri-o://86e6601427412c7c3af59e9e2879f74c1dc371d5461e4542966ab7f4b570ad15" gracePeriod=30 Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.740199 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.740573 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-log" containerID="cri-o://78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6" gracePeriod=30 Dec 02 20:24:09 crc kubenswrapper[4807]: I1202 20:24:09.740839 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-metadata" containerID="cri-o://7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409" gracePeriod=30 Dec 02 20:24:10 crc kubenswrapper[4807]: I1202 20:24:10.441098 4807 generic.go:334] "Generic (PLEG): container finished" podID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerID="f653b180249a9a4e8bde1e96a28376e953a014feaa1d94c86eafbf91ac6c9f9b" exitCode=143 Dec 02 20:24:10 crc kubenswrapper[4807]: I1202 20:24:10.441305 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371","Type":"ContainerDied","Data":"f653b180249a9a4e8bde1e96a28376e953a014feaa1d94c86eafbf91ac6c9f9b"} Dec 02 20:24:10 crc kubenswrapper[4807]: I1202 20:24:10.444404 4807 generic.go:334] "Generic (PLEG): container finished" podID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerID="78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6" exitCode=143 Dec 02 20:24:10 crc kubenswrapper[4807]: I1202 20:24:10.444443 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"93b166c0-bf72-4774-97dc-f22b3b02f15a","Type":"ContainerDied","Data":"78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6"} Dec 02 20:24:11 crc kubenswrapper[4807]: I1202 20:24:11.459581 4807 generic.go:334] "Generic (PLEG): container finished" podID="6d49888e-5c74-4550-b491-8b526b9fe3a8" containerID="d9c534ccd25cbca66b8c1cb2a6f93fd09e50be6710183062e4025e1b1fb1780e" exitCode=0 Dec 02 20:24:11 crc kubenswrapper[4807]: I1202 20:24:11.459656 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6d49888e-5c74-4550-b491-8b526b9fe3a8","Type":"ContainerDied","Data":"d9c534ccd25cbca66b8c1cb2a6f93fd09e50be6710183062e4025e1b1fb1780e"} Dec 02 20:24:11 crc kubenswrapper[4807]: I1202 20:24:11.609175 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 20:24:11 crc kubenswrapper[4807]: I1202 20:24:11.737622 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdnt2\" (UniqueName: \"kubernetes.io/projected/6d49888e-5c74-4550-b491-8b526b9fe3a8-kube-api-access-hdnt2\") pod \"6d49888e-5c74-4550-b491-8b526b9fe3a8\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " Dec 02 20:24:11 crc kubenswrapper[4807]: I1202 20:24:11.738449 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-combined-ca-bundle\") pod \"6d49888e-5c74-4550-b491-8b526b9fe3a8\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " Dec 02 20:24:11 crc kubenswrapper[4807]: I1202 20:24:11.738549 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-config-data\") pod \"6d49888e-5c74-4550-b491-8b526b9fe3a8\" (UID: \"6d49888e-5c74-4550-b491-8b526b9fe3a8\") " Dec 02 20:24:11 crc kubenswrapper[4807]: 
I1202 20:24:11.746675 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d49888e-5c74-4550-b491-8b526b9fe3a8-kube-api-access-hdnt2" (OuterVolumeSpecName: "kube-api-access-hdnt2") pod "6d49888e-5c74-4550-b491-8b526b9fe3a8" (UID: "6d49888e-5c74-4550-b491-8b526b9fe3a8"). InnerVolumeSpecName "kube-api-access-hdnt2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:24:11 crc kubenswrapper[4807]: I1202 20:24:11.778918 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d49888e-5c74-4550-b491-8b526b9fe3a8" (UID: "6d49888e-5c74-4550-b491-8b526b9fe3a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:24:11 crc kubenswrapper[4807]: I1202 20:24:11.779006 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-config-data" (OuterVolumeSpecName: "config-data") pod "6d49888e-5c74-4550-b491-8b526b9fe3a8" (UID: "6d49888e-5c74-4550-b491-8b526b9fe3a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:24:11 crc kubenswrapper[4807]: I1202 20:24:11.841623 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 20:24:11 crc kubenswrapper[4807]: I1202 20:24:11.841691 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d49888e-5c74-4550-b491-8b526b9fe3a8-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 20:24:11 crc kubenswrapper[4807]: I1202 20:24:11.841706 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdnt2\" (UniqueName: \"kubernetes.io/projected/6d49888e-5c74-4550-b491-8b526b9fe3a8-kube-api-access-hdnt2\") on node \"crc\" DevicePath \"\""
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.480561 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6d49888e-5c74-4550-b491-8b526b9fe3a8","Type":"ContainerDied","Data":"159cbfde0be03e78fd11b892b87f2ed9888e3a56d2a0a1cb2008be2d4b303599"}
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.480636 4807 scope.go:117] "RemoveContainer" containerID="d9c534ccd25cbca66b8c1cb2a6f93fd09e50be6710183062e4025e1b1fb1780e"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.480879 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.543190 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.570357 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.599572 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 02 20:24:12 crc kubenswrapper[4807]: E1202 20:24:12.600668 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11461dcc-2c59-4b98-b246-5908dca9a42b" containerName="registry-server"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.600685 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="11461dcc-2c59-4b98-b246-5908dca9a42b" containerName="registry-server"
Dec 02 20:24:12 crc kubenswrapper[4807]: E1202 20:24:12.600700 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2148eff-b1b3-45a5-9e9e-0769521c4cb7" containerName="init"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.600707 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2148eff-b1b3-45a5-9e9e-0769521c4cb7" containerName="init"
Dec 02 20:24:12 crc kubenswrapper[4807]: E1202 20:24:12.600735 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd0c1a2e-9558-4538-8e36-aa4def438cc1" containerName="nova-manage"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.600743 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0c1a2e-9558-4538-8e36-aa4def438cc1" containerName="nova-manage"
Dec 02 20:24:12 crc kubenswrapper[4807]: E1202 20:24:12.600776 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2148eff-b1b3-45a5-9e9e-0769521c4cb7" containerName="dnsmasq-dns"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.600783 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2148eff-b1b3-45a5-9e9e-0769521c4cb7" containerName="dnsmasq-dns"
Dec 02 20:24:12 crc kubenswrapper[4807]: E1202 20:24:12.600799 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11461dcc-2c59-4b98-b246-5908dca9a42b" containerName="extract-content"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.600806 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="11461dcc-2c59-4b98-b246-5908dca9a42b" containerName="extract-content"
Dec 02 20:24:12 crc kubenswrapper[4807]: E1202 20:24:12.600843 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11461dcc-2c59-4b98-b246-5908dca9a42b" containerName="extract-utilities"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.600851 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="11461dcc-2c59-4b98-b246-5908dca9a42b" containerName="extract-utilities"
Dec 02 20:24:12 crc kubenswrapper[4807]: E1202 20:24:12.600864 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d49888e-5c74-4550-b491-8b526b9fe3a8" containerName="nova-scheduler-scheduler"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.600870 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d49888e-5c74-4550-b491-8b526b9fe3a8" containerName="nova-scheduler-scheduler"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.601260 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2148eff-b1b3-45a5-9e9e-0769521c4cb7" containerName="dnsmasq-dns"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.601284 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="11461dcc-2c59-4b98-b246-5908dca9a42b" containerName="registry-server"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.601301 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d49888e-5c74-4550-b491-8b526b9fe3a8" containerName="nova-scheduler-scheduler"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.601331 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd0c1a2e-9558-4538-8e36-aa4def438cc1" containerName="nova-manage"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.602596 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.608148 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.631144 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.769973 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db3f424-7a28-419f-b5e1-0dec9279d417-config-data\") pod \"nova-scheduler-0\" (UID: \"3db3f424-7a28-419f-b5e1-0dec9279d417\") " pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.770505 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwvqf\" (UniqueName: \"kubernetes.io/projected/3db3f424-7a28-419f-b5e1-0dec9279d417-kube-api-access-pwvqf\") pod \"nova-scheduler-0\" (UID: \"3db3f424-7a28-419f-b5e1-0dec9279d417\") " pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.770617 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db3f424-7a28-419f-b5e1-0dec9279d417-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3db3f424-7a28-419f-b5e1-0dec9279d417\") " pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.873185 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db3f424-7a28-419f-b5e1-0dec9279d417-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3db3f424-7a28-419f-b5e1-0dec9279d417\") " pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.873317 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db3f424-7a28-419f-b5e1-0dec9279d417-config-data\") pod \"nova-scheduler-0\" (UID: \"3db3f424-7a28-419f-b5e1-0dec9279d417\") " pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.873389 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwvqf\" (UniqueName: \"kubernetes.io/projected/3db3f424-7a28-419f-b5e1-0dec9279d417-kube-api-access-pwvqf\") pod \"nova-scheduler-0\" (UID: \"3db3f424-7a28-419f-b5e1-0dec9279d417\") " pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.881620 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db3f424-7a28-419f-b5e1-0dec9279d417-config-data\") pod \"nova-scheduler-0\" (UID: \"3db3f424-7a28-419f-b5e1-0dec9279d417\") " pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.882882 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": read tcp 10.217.0.2:55490->10.217.0.211:8775: read: connection reset by peer"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.883087 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": read tcp 10.217.0.2:55484->10.217.0.211:8775: read: connection reset by peer"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.885427 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db3f424-7a28-419f-b5e1-0dec9279d417-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3db3f424-7a28-419f-b5e1-0dec9279d417\") " pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.891663 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwvqf\" (UniqueName: \"kubernetes.io/projected/3db3f424-7a28-419f-b5e1-0dec9279d417-kube-api-access-pwvqf\") pod \"nova-scheduler-0\" (UID: \"3db3f424-7a28-419f-b5e1-0dec9279d417\") " pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.939508 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 02 20:24:12 crc kubenswrapper[4807]: I1202 20:24:12.993632 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d49888e-5c74-4550-b491-8b526b9fe3a8" path="/var/lib/kubelet/pods/6d49888e-5c74-4550-b491-8b526b9fe3a8/volumes"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.335113 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.391854 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-config-data\") pod \"93b166c0-bf72-4774-97dc-f22b3b02f15a\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") "
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.391953 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khks6\" (UniqueName: \"kubernetes.io/projected/93b166c0-bf72-4774-97dc-f22b3b02f15a-kube-api-access-khks6\") pod \"93b166c0-bf72-4774-97dc-f22b3b02f15a\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") "
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.392088 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b166c0-bf72-4774-97dc-f22b3b02f15a-logs\") pod \"93b166c0-bf72-4774-97dc-f22b3b02f15a\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") "
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.392191 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-nova-metadata-tls-certs\") pod \"93b166c0-bf72-4774-97dc-f22b3b02f15a\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") "
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.392368 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-combined-ca-bundle\") pod \"93b166c0-bf72-4774-97dc-f22b3b02f15a\" (UID: \"93b166c0-bf72-4774-97dc-f22b3b02f15a\") "
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.393706 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b166c0-bf72-4774-97dc-f22b3b02f15a-logs" (OuterVolumeSpecName: "logs") pod "93b166c0-bf72-4774-97dc-f22b3b02f15a" (UID: "93b166c0-bf72-4774-97dc-f22b3b02f15a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.394149 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93b166c0-bf72-4774-97dc-f22b3b02f15a-logs\") on node \"crc\" DevicePath \"\""
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.400995 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b166c0-bf72-4774-97dc-f22b3b02f15a-kube-api-access-khks6" (OuterVolumeSpecName: "kube-api-access-khks6") pod "93b166c0-bf72-4774-97dc-f22b3b02f15a" (UID: "93b166c0-bf72-4774-97dc-f22b3b02f15a"). InnerVolumeSpecName "kube-api-access-khks6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.445986 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93b166c0-bf72-4774-97dc-f22b3b02f15a" (UID: "93b166c0-bf72-4774-97dc-f22b3b02f15a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.453893 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-config-data" (OuterVolumeSpecName: "config-data") pod "93b166c0-bf72-4774-97dc-f22b3b02f15a" (UID: "93b166c0-bf72-4774-97dc-f22b3b02f15a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.483221 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "93b166c0-bf72-4774-97dc-f22b3b02f15a" (UID: "93b166c0-bf72-4774-97dc-f22b3b02f15a"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.497344 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.498257 4807 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.498981 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.499010 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b166c0-bf72-4774-97dc-f22b3b02f15a-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.499054 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khks6\" (UniqueName: \"kubernetes.io/projected/93b166c0-bf72-4774-97dc-f22b3b02f15a-kube-api-access-khks6\") on node \"crc\" DevicePath \"\""
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.516512 4807 generic.go:334] "Generic (PLEG): container finished" podID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerID="7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409" exitCode=0
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.516603 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93b166c0-bf72-4774-97dc-f22b3b02f15a","Type":"ContainerDied","Data":"7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409"}
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.516644 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"93b166c0-bf72-4774-97dc-f22b3b02f15a","Type":"ContainerDied","Data":"003dd17181e1d63f72bfd87568cb33bae48e5fc6567ac3258b6d7bbfafa01eb7"}
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.516668 4807 scope.go:117] "RemoveContainer" containerID="7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.516693 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.557814 4807 scope.go:117] "RemoveContainer" containerID="78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.567612 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.585437 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.600865 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 20:24:13 crc kubenswrapper[4807]: E1202 20:24:13.601609 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-metadata"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.601632 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-metadata"
Dec 02 20:24:13 crc kubenswrapper[4807]: E1202 20:24:13.601698 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-log"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.601706 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-log"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.601969 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-metadata"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.601992 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" containerName="nova-metadata-log"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.604239 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.609778 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.611783 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.614919 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.625330 4807 scope.go:117] "RemoveContainer" containerID="7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409"
Dec 02 20:24:13 crc kubenswrapper[4807]: E1202 20:24:13.626435 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409\": container with ID starting with 7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409 not found: ID does not exist" containerID="7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.626491 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409"} err="failed to get container status \"7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409\": rpc error: code = NotFound desc = could not find container \"7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409\": container with ID starting with 7c5abb924772793a51bb041fe8197748c0a23a5cb9eb2a2ab3bcb40974409409 not found: ID does not exist"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.626530 4807 scope.go:117] "RemoveContainer" containerID="78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6"
Dec 02 20:24:13 crc kubenswrapper[4807]: E1202 20:24:13.626876 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6\": container with ID starting with 78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6 not found: ID does not exist" containerID="78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.626905 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6"} err="failed to get container status \"78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6\": rpc error: code = NotFound desc = could not find container \"78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6\": container with ID starting with 78ce639c08d471f6b7f4abc5a9a6915b418f13589906c6bc30d94bbccbaf46b6 not found: ID does not exist"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.710647 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96fcc760-d492-4e5b-8d31-de6c7f49b47f-logs\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.710749 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fcc760-d492-4e5b-8d31-de6c7f49b47f-config-data\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.710961 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fcc760-d492-4e5b-8d31-de6c7f49b47f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.711007 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88nzv\" (UniqueName: \"kubernetes.io/projected/96fcc760-d492-4e5b-8d31-de6c7f49b47f-kube-api-access-88nzv\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.711407 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96fcc760-d492-4e5b-8d31-de6c7f49b47f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.814155 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fcc760-d492-4e5b-8d31-de6c7f49b47f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.814538 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88nzv\" (UniqueName: \"kubernetes.io/projected/96fcc760-d492-4e5b-8d31-de6c7f49b47f-kube-api-access-88nzv\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.814764 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96fcc760-d492-4e5b-8d31-de6c7f49b47f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.814959 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96fcc760-d492-4e5b-8d31-de6c7f49b47f-logs\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.815295 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fcc760-d492-4e5b-8d31-de6c7f49b47f-config-data\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.815433 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96fcc760-d492-4e5b-8d31-de6c7f49b47f-logs\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.818706 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fcc760-d492-4e5b-8d31-de6c7f49b47f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.819053 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/96fcc760-d492-4e5b-8d31-de6c7f49b47f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.820623 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96fcc760-d492-4e5b-8d31-de6c7f49b47f-config-data\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.837971 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88nzv\" (UniqueName: \"kubernetes.io/projected/96fcc760-d492-4e5b-8d31-de6c7f49b47f-kube-api-access-88nzv\") pod \"nova-metadata-0\" (UID: \"96fcc760-d492-4e5b-8d31-de6c7f49b47f\") " pod="openstack/nova-metadata-0"
Dec 02 20:24:13 crc kubenswrapper[4807]: I1202 20:24:13.937826 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 02 20:24:14 crc kubenswrapper[4807]: W1202 20:24:14.449856 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96fcc760_d492_4e5b_8d31_de6c7f49b47f.slice/crio-cb2a897bfb1f65fc77ce55691fa73c54c0c03a761a64d0a0a75ae887e293f07f WatchSource:0}: Error finding container cb2a897bfb1f65fc77ce55691fa73c54c0c03a761a64d0a0a75ae887e293f07f: Status 404 returned error can't find the container with id cb2a897bfb1f65fc77ce55691fa73c54c0c03a761a64d0a0a75ae887e293f07f
Dec 02 20:24:14 crc kubenswrapper[4807]: I1202 20:24:14.471174 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 02 20:24:14 crc kubenswrapper[4807]: I1202 20:24:14.540165 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96fcc760-d492-4e5b-8d31-de6c7f49b47f","Type":"ContainerStarted","Data":"cb2a897bfb1f65fc77ce55691fa73c54c0c03a761a64d0a0a75ae887e293f07f"}
Dec 02 20:24:14 crc kubenswrapper[4807]: I1202 20:24:14.543109 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3db3f424-7a28-419f-b5e1-0dec9279d417","Type":"ContainerStarted","Data":"baa4a2d2d3af5fe19e80551f16ceade4dada71e5d5a563c2261ac64e4e4b84f4"}
Dec 02 20:24:14 crc kubenswrapper[4807]: I1202 20:24:14.543149 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3db3f424-7a28-419f-b5e1-0dec9279d417","Type":"ContainerStarted","Data":"bbc0549e2fe58c62d5bfe7243dbb4c45e9f25f41cc044a8e3448ebf7c2116075"}
Dec 02 20:24:14 crc kubenswrapper[4807]: I1202 20:24:14.569882 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.569857468 podStartE2EDuration="2.569857468s" podCreationTimestamp="2025-12-02 20:24:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:24:14.564020144 +0000 UTC m=+1589.864927639" watchObservedRunningTime="2025-12-02 20:24:14.569857468 +0000 UTC m=+1589.870764963"
Dec 02 20:24:14 crc kubenswrapper[4807]: I1202 20:24:14.994412 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93b166c0-bf72-4774-97dc-f22b3b02f15a" path="/var/lib/kubelet/pods/93b166c0-bf72-4774-97dc-f22b3b02f15a/volumes"
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.559040 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96fcc760-d492-4e5b-8d31-de6c7f49b47f","Type":"ContainerStarted","Data":"3c212376af511e2325af0f7d5dfcc5787aad19695b26babc78fd412ca8f370a8"}
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.559491 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"96fcc760-d492-4e5b-8d31-de6c7f49b47f","Type":"ContainerStarted","Data":"43c5ea4f062410b0dd428ba6633cca510b2f89d18ee823aed0fdb7c7df9f2175"}
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.563503 4807 generic.go:334] "Generic (PLEG): container finished" podID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerID="86e6601427412c7c3af59e9e2879f74c1dc371d5461e4542966ab7f4b570ad15" exitCode=0
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.563773 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371","Type":"ContainerDied","Data":"86e6601427412c7c3af59e9e2879f74c1dc371d5461e4542966ab7f4b570ad15"}
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.563824 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371","Type":"ContainerDied","Data":"8a1c3e17bf3d9452eb837c46884f4e42c81bd255659c4c331a754b590d8f19c3"}
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.563839 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a1c3e17bf3d9452eb837c46884f4e42c81bd255659c4c331a754b590d8f19c3"
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.586077 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.586046481 podStartE2EDuration="2.586046481s" podCreationTimestamp="2025-12-02 20:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:24:15.581085923 +0000 UTC m=+1590.881993438" watchObservedRunningTime="2025-12-02 20:24:15.586046481 +0000 UTC m=+1590.886953976"
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.618402 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.662126 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-internal-tls-certs\") pod \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") "
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.662220 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-combined-ca-bundle\") pod \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") "
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.662487 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26tms\" (UniqueName: \"kubernetes.io/projected/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-kube-api-access-26tms\") pod \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") "
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.662517 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-public-tls-certs\") pod \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") "
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.662635 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-logs\") pod \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") "
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.662785 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-config-data\") pod \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\" (UID: \"8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371\") "
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.663344 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-logs" (OuterVolumeSpecName: "logs") pod "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" (UID: "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.664156 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-logs\") on node \"crc\" DevicePath \"\""
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.680230 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-kube-api-access-26tms" (OuterVolumeSpecName: "kube-api-access-26tms") pod "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" (UID: "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371"). InnerVolumeSpecName "kube-api-access-26tms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.701853 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-config-data" (OuterVolumeSpecName: "config-data") pod "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" (UID: "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.705525 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" (UID: "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.725205 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" (UID: "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.736671 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" (UID: "8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371"). InnerVolumeSpecName "public-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.766091 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26tms\" (UniqueName: \"kubernetes.io/projected/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-kube-api-access-26tms\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.766142 4807 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.766157 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.766170 4807 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:15 crc kubenswrapper[4807]: I1202 20:24:15.766179 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.573120 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.618976 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.630867 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.665607 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 20:24:16 crc kubenswrapper[4807]: E1202 20:24:16.666652 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerName="nova-api-api" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.666694 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerName="nova-api-api" Dec 02 20:24:16 crc kubenswrapper[4807]: E1202 20:24:16.666759 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerName="nova-api-log" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.666772 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerName="nova-api-log" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.667191 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerName="nova-api-api" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.667229 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" containerName="nova-api-log" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.669913 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.673050 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.676925 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.683644 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.690000 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-public-tls-certs\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.690074 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfsxx\" (UniqueName: \"kubernetes.io/projected/9caa6001-4e75-4042-998f-9f00f49ef173-kube-api-access-mfsxx\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.690309 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.690394 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9caa6001-4e75-4042-998f-9f00f49ef173-logs\") pod \"nova-api-0\" (UID: 
\"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.690794 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-config-data\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.690912 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.694648 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.793767 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.793893 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-public-tls-certs\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.793925 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfsxx\" (UniqueName: \"kubernetes.io/projected/9caa6001-4e75-4042-998f-9f00f49ef173-kube-api-access-mfsxx\") pod \"nova-api-0\" (UID: 
\"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.793993 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.794027 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9caa6001-4e75-4042-998f-9f00f49ef173-logs\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.794136 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-config-data\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.794566 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9caa6001-4e75-4042-998f-9f00f49ef173-logs\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.797762 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-public-tls-certs\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.797962 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-config-data\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.800694 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.802972 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9caa6001-4e75-4042-998f-9f00f49ef173-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.813392 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfsxx\" (UniqueName: \"kubernetes.io/projected/9caa6001-4e75-4042-998f-9f00f49ef173-kube-api-access-mfsxx\") pod \"nova-api-0\" (UID: \"9caa6001-4e75-4042-998f-9f00f49ef173\") " pod="openstack/nova-api-0" Dec 02 20:24:16 crc kubenswrapper[4807]: I1202 20:24:16.989646 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371" path="/var/lib/kubelet/pods/8ca3bb70-0ba4-47cf-bbfb-cd8c34b79371/volumes" Dec 02 20:24:17 crc kubenswrapper[4807]: I1202 20:24:17.008689 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 20:24:17 crc kubenswrapper[4807]: I1202 20:24:17.503403 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 20:24:17 crc kubenswrapper[4807]: I1202 20:24:17.599968 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9caa6001-4e75-4042-998f-9f00f49ef173","Type":"ContainerStarted","Data":"dbd0ce096eeb87c9244d57a28b0ed19d10d8aaa539b1acfc3864d00aa4aafd96"} Dec 02 20:24:17 crc kubenswrapper[4807]: I1202 20:24:17.940744 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 20:24:18 crc kubenswrapper[4807]: I1202 20:24:18.615743 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9caa6001-4e75-4042-998f-9f00f49ef173","Type":"ContainerStarted","Data":"180dd7520607bcacbaf8c755cf5bc4bff1f5e4739e0221a47c8c1c6fdee5bc05"} Dec 02 20:24:18 crc kubenswrapper[4807]: I1202 20:24:18.615817 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9caa6001-4e75-4042-998f-9f00f49ef173","Type":"ContainerStarted","Data":"f81fc1a29ef1bd0af54d4d15f358f082313aa7cc674ed457207129c7c37aa3f3"} Dec 02 20:24:18 crc kubenswrapper[4807]: I1202 20:24:18.647358 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.647332348 podStartE2EDuration="2.647332348s" podCreationTimestamp="2025-12-02 20:24:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:24:18.640645439 +0000 UTC m=+1593.941552934" watchObservedRunningTime="2025-12-02 20:24:18.647332348 +0000 UTC m=+1593.948239833" Dec 02 20:24:18 crc kubenswrapper[4807]: I1202 20:24:18.938802 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 20:24:18 crc 
kubenswrapper[4807]: I1202 20:24:18.939309 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 20:24:22 crc kubenswrapper[4807]: I1202 20:24:22.941301 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 20:24:22 crc kubenswrapper[4807]: I1202 20:24:22.988681 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 20:24:23 crc kubenswrapper[4807]: I1202 20:24:23.747355 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 20:24:23 crc kubenswrapper[4807]: I1202 20:24:23.938579 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 20:24:23 crc kubenswrapper[4807]: I1202 20:24:23.938663 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 20:24:24 crc kubenswrapper[4807]: I1202 20:24:24.955878 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="96fcc760-d492-4e5b-8d31-de6c7f49b47f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.223:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 20:24:24 crc kubenswrapper[4807]: I1202 20:24:24.955896 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="96fcc760-d492-4e5b-8d31-de6c7f49b47f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.223:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 20:24:27 crc kubenswrapper[4807]: I1202 20:24:27.009491 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 20:24:27 crc kubenswrapper[4807]: I1202 20:24:27.009998 4807 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.030062 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9caa6001-4e75-4042-998f-9f00f49ef173" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.030089 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9caa6001-4e75-4042-998f-9f00f49ef173" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.293185 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.293354 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.293470 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.294978 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.295080 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" gracePeriod=600 Dec 02 20:24:28 crc kubenswrapper[4807]: E1202 20:24:28.441010 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.760744 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" exitCode=0 Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.760837 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129"} Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.761248 4807 scope.go:117] "RemoveContainer" containerID="13f662bee9488997accc2766f7577233b423e1195e39b433e12fc85d986a041b" Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.764077 4807 
scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:24:28 crc kubenswrapper[4807]: E1202 20:24:28.767244 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:24:28 crc kubenswrapper[4807]: I1202 20:24:28.922194 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 20:24:33 crc kubenswrapper[4807]: I1202 20:24:33.948243 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 20:24:33 crc kubenswrapper[4807]: I1202 20:24:33.964106 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 20:24:33 crc kubenswrapper[4807]: I1202 20:24:33.968927 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 20:24:34 crc kubenswrapper[4807]: I1202 20:24:34.864827 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 20:24:37 crc kubenswrapper[4807]: I1202 20:24:37.019067 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 20:24:37 crc kubenswrapper[4807]: I1202 20:24:37.020488 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 20:24:37 crc kubenswrapper[4807]: I1202 20:24:37.024853 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 20:24:37 crc kubenswrapper[4807]: I1202 20:24:37.031414 
4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 20:24:37 crc kubenswrapper[4807]: I1202 20:24:37.899208 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 20:24:37 crc kubenswrapper[4807]: I1202 20:24:37.911322 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 20:24:41 crc kubenswrapper[4807]: I1202 20:24:41.973094 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:24:41 crc kubenswrapper[4807]: E1202 20:24:41.974475 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:24:46 crc kubenswrapper[4807]: I1202 20:24:46.882591 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 20:24:48 crc kubenswrapper[4807]: I1202 20:24:48.139186 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 20:24:51 crc kubenswrapper[4807]: I1202 20:24:51.759187 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="ef1b0038-6b67-47e8-92e8-06efe88df856" containerName="rabbitmq" containerID="cri-o://0af067dc0eb81760a184db00d47c1e1ecb454638303485d36be2f0bf5370e0f8" gracePeriod=604796 Dec 02 20:24:52 crc kubenswrapper[4807]: I1202 20:24:52.626113 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="2af53e2b-bee9-4bd4-aab6-e58d33057ac7" 
containerName="rabbitmq" containerID="cri-o://c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789" gracePeriod=604796 Dec 02 20:24:52 crc kubenswrapper[4807]: I1202 20:24:52.973520 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:24:52 crc kubenswrapper[4807]: E1202 20:24:52.973858 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:24:55 crc kubenswrapper[4807]: I1202 20:24:55.197847 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="ef1b0038-6b67-47e8-92e8-06efe88df856" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Dec 02 20:24:55 crc kubenswrapper[4807]: I1202 20:24:55.234486 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="2af53e2b-bee9-4bd4-aab6-e58d33057ac7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.223382 4807 generic.go:334] "Generic (PLEG): container finished" podID="ef1b0038-6b67-47e8-92e8-06efe88df856" containerID="0af067dc0eb81760a184db00d47c1e1ecb454638303485d36be2f0bf5370e0f8" exitCode=0 Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.223477 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1b0038-6b67-47e8-92e8-06efe88df856","Type":"ContainerDied","Data":"0af067dc0eb81760a184db00d47c1e1ecb454638303485d36be2f0bf5370e0f8"} Dec 02 
20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.387930 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.500792 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ef1b0038-6b67-47e8-92e8-06efe88df856\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.501000 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef1b0038-6b67-47e8-92e8-06efe88df856-pod-info\") pod \"ef1b0038-6b67-47e8-92e8-06efe88df856\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.501118 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vczf7\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-kube-api-access-vczf7\") pod \"ef1b0038-6b67-47e8-92e8-06efe88df856\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.501239 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-confd\") pod \"ef1b0038-6b67-47e8-92e8-06efe88df856\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.501313 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-config-data\") pod \"ef1b0038-6b67-47e8-92e8-06efe88df856\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.501381 4807 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-plugins\") pod \"ef1b0038-6b67-47e8-92e8-06efe88df856\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.501433 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-erlang-cookie\") pod \"ef1b0038-6b67-47e8-92e8-06efe88df856\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.501490 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef1b0038-6b67-47e8-92e8-06efe88df856-erlang-cookie-secret\") pod \"ef1b0038-6b67-47e8-92e8-06efe88df856\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.501542 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-plugins-conf\") pod \"ef1b0038-6b67-47e8-92e8-06efe88df856\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.501587 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-server-conf\") pod \"ef1b0038-6b67-47e8-92e8-06efe88df856\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.501648 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-tls\") pod 
\"ef1b0038-6b67-47e8-92e8-06efe88df856\" (UID: \"ef1b0038-6b67-47e8-92e8-06efe88df856\") " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.511388 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "ef1b0038-6b67-47e8-92e8-06efe88df856" (UID: "ef1b0038-6b67-47e8-92e8-06efe88df856"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.512138 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ef1b0038-6b67-47e8-92e8-06efe88df856" (UID: "ef1b0038-6b67-47e8-92e8-06efe88df856"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.512481 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ef1b0038-6b67-47e8-92e8-06efe88df856" (UID: "ef1b0038-6b67-47e8-92e8-06efe88df856"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.512463 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ef1b0038-6b67-47e8-92e8-06efe88df856" (UID: "ef1b0038-6b67-47e8-92e8-06efe88df856"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.512583 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ef1b0038-6b67-47e8-92e8-06efe88df856-pod-info" (OuterVolumeSpecName: "pod-info") pod "ef1b0038-6b67-47e8-92e8-06efe88df856" (UID: "ef1b0038-6b67-47e8-92e8-06efe88df856"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.525851 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-kube-api-access-vczf7" (OuterVolumeSpecName: "kube-api-access-vczf7") pod "ef1b0038-6b67-47e8-92e8-06efe88df856" (UID: "ef1b0038-6b67-47e8-92e8-06efe88df856"). InnerVolumeSpecName "kube-api-access-vczf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.526280 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ef1b0038-6b67-47e8-92e8-06efe88df856" (UID: "ef1b0038-6b67-47e8-92e8-06efe88df856"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.541899 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1b0038-6b67-47e8-92e8-06efe88df856-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ef1b0038-6b67-47e8-92e8-06efe88df856" (UID: "ef1b0038-6b67-47e8-92e8-06efe88df856"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.551263 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-config-data" (OuterVolumeSpecName: "config-data") pod "ef1b0038-6b67-47e8-92e8-06efe88df856" (UID: "ef1b0038-6b67-47e8-92e8-06efe88df856"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.595521 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-server-conf" (OuterVolumeSpecName: "server-conf") pod "ef1b0038-6b67-47e8-92e8-06efe88df856" (UID: "ef1b0038-6b67-47e8-92e8-06efe88df856"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.606528 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.606578 4807 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ef1b0038-6b67-47e8-92e8-06efe88df856-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.606590 4807 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.606602 4807 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-server-conf\") on node \"crc\" DevicePath 
\"\"" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.606611 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.606638 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.606648 4807 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ef1b0038-6b67-47e8-92e8-06efe88df856-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.606658 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vczf7\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-kube-api-access-vczf7\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.606667 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef1b0038-6b67-47e8-92e8-06efe88df856-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.606676 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.674287 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.709279 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.719111 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ef1b0038-6b67-47e8-92e8-06efe88df856" (UID: "ef1b0038-6b67-47e8-92e8-06efe88df856"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:24:58 crc kubenswrapper[4807]: I1202 20:24:58.810876 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ef1b0038-6b67-47e8-92e8-06efe88df856-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.228432 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.238423 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ef1b0038-6b67-47e8-92e8-06efe88df856","Type":"ContainerDied","Data":"152c8403c19d89f86582ad83cbb2995d1d787feb835481ee548f97d80216db79"} Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.238497 4807 scope.go:117] "RemoveContainer" containerID="0af067dc0eb81760a184db00d47c1e1ecb454638303485d36be2f0bf5370e0f8" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.238521 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.241507 4807 generic.go:334] "Generic (PLEG): container finished" podID="2af53e2b-bee9-4bd4-aab6-e58d33057ac7" containerID="c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789" exitCode=0 Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.241565 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2af53e2b-bee9-4bd4-aab6-e58d33057ac7","Type":"ContainerDied","Data":"c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789"} Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.241606 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"2af53e2b-bee9-4bd4-aab6-e58d33057ac7","Type":"ContainerDied","Data":"017f9569ab6a8f1f88a7ec92980fd56f03bac66b21db08931cfc99192a24a5f9"} Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.241632 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.280390 4807 scope.go:117] "RemoveContainer" containerID="7f285fca8e0cd1cf35353ce537c1073826621a4833a365e1187dd2c1899b4466" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.302988 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.312101 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.341558 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 20:24:59 crc kubenswrapper[4807]: E1202 20:24:59.344554 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af53e2b-bee9-4bd4-aab6-e58d33057ac7" containerName="setup-container" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.344586 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af53e2b-bee9-4bd4-aab6-e58d33057ac7" containerName="setup-container" Dec 02 20:24:59 crc kubenswrapper[4807]: E1202 20:24:59.344609 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1b0038-6b67-47e8-92e8-06efe88df856" containerName="rabbitmq" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.344616 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1b0038-6b67-47e8-92e8-06efe88df856" containerName="rabbitmq" Dec 02 20:24:59 crc kubenswrapper[4807]: E1202 20:24:59.344637 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af53e2b-bee9-4bd4-aab6-e58d33057ac7" containerName="rabbitmq" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.344643 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af53e2b-bee9-4bd4-aab6-e58d33057ac7" containerName="rabbitmq" Dec 02 20:24:59 crc kubenswrapper[4807]: E1202 20:24:59.344658 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef1b0038-6b67-47e8-92e8-06efe88df856" containerName="setup-container" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.344665 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1b0038-6b67-47e8-92e8-06efe88df856" containerName="setup-container" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.344887 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af53e2b-bee9-4bd4-aab6-e58d33057ac7" containerName="rabbitmq" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.344904 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1b0038-6b67-47e8-92e8-06efe88df856" containerName="rabbitmq" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.346143 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.350134 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.350242 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.350347 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.350411 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-p86rf" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.350460 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.350134 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.351543 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-server-conf" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.357079 4807 scope.go:117] "RemoveContainer" containerID="c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.373635 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.421909 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-plugins-conf\") pod \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.423860 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2af53e2b-bee9-4bd4-aab6-e58d33057ac7" (UID: "2af53e2b-bee9-4bd4-aab6-e58d33057ac7"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.423974 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-tls\") pod \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.425615 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-config-data\") pod \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.428693 4807 scope.go:117] "RemoveContainer" containerID="3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.429530 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-plugins\") pod \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.429575 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-erlang-cookie\") pod \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.429648 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " Dec 02 20:24:59 
crc kubenswrapper[4807]: I1202 20:24:59.429689 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl7c9\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-kube-api-access-xl7c9\") pod \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.429776 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-pod-info\") pod \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.429882 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-erlang-cookie-secret\") pod \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.429922 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-confd\") pod \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.429953 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-server-conf\") pod \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\" (UID: \"2af53e2b-bee9-4bd4-aab6-e58d33057ac7\") " Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.431887 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-erlang-cookie" 
(OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2af53e2b-bee9-4bd4-aab6-e58d33057ac7" (UID: "2af53e2b-bee9-4bd4-aab6-e58d33057ac7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.432303 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2af53e2b-bee9-4bd4-aab6-e58d33057ac7" (UID: "2af53e2b-bee9-4bd4-aab6-e58d33057ac7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.435402 4807 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.450030 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-kube-api-access-xl7c9" (OuterVolumeSpecName: "kube-api-access-xl7c9") pod "2af53e2b-bee9-4bd4-aab6-e58d33057ac7" (UID: "2af53e2b-bee9-4bd4-aab6-e58d33057ac7"). InnerVolumeSpecName "kube-api-access-xl7c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.450508 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2af53e2b-bee9-4bd4-aab6-e58d33057ac7" (UID: "2af53e2b-bee9-4bd4-aab6-e58d33057ac7"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.450605 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2af53e2b-bee9-4bd4-aab6-e58d33057ac7" (UID: "2af53e2b-bee9-4bd4-aab6-e58d33057ac7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.463055 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "2af53e2b-bee9-4bd4-aab6-e58d33057ac7" (UID: "2af53e2b-bee9-4bd4-aab6-e58d33057ac7"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.463472 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-pod-info" (OuterVolumeSpecName: "pod-info") pod "2af53e2b-bee9-4bd4-aab6-e58d33057ac7" (UID: "2af53e2b-bee9-4bd4-aab6-e58d33057ac7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.493547 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-config-data" (OuterVolumeSpecName: "config-data") pod "2af53e2b-bee9-4bd4-aab6-e58d33057ac7" (UID: "2af53e2b-bee9-4bd4-aab6-e58d33057ac7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.537921 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8593e062-85d8-4f22-88b4-eb7cf5654859-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.538444 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.538960 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539011 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8593e062-85d8-4f22-88b4-eb7cf5654859-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539068 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " 
pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539102 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8593e062-85d8-4f22-88b4-eb7cf5654859-config-data\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539130 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj8ld\" (UniqueName: \"kubernetes.io/projected/8593e062-85d8-4f22-88b4-eb7cf5654859-kube-api-access-xj8ld\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539164 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8593e062-85d8-4f22-88b4-eb7cf5654859-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539321 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539368 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 
20:24:59.539418 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8593e062-85d8-4f22-88b4-eb7cf5654859-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539497 4807 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539512 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539525 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539537 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539549 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539580 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 
20:24:59.539591 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl7c9\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-kube-api-access-xl7c9\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.539604 4807 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.540158 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-server-conf" (OuterVolumeSpecName: "server-conf") pod "2af53e2b-bee9-4bd4-aab6-e58d33057ac7" (UID: "2af53e2b-bee9-4bd4-aab6-e58d33057ac7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.540399 4807 scope.go:117] "RemoveContainer" containerID="c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789" Dec 02 20:24:59 crc kubenswrapper[4807]: E1202 20:24:59.543433 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789\": container with ID starting with c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789 not found: ID does not exist" containerID="c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.543491 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789"} err="failed to get container status \"c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789\": rpc error: code = NotFound desc = could not find container 
\"c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789\": container with ID starting with c1ee0e43b19e0a520fcaf9c1c2c70a52940c36cf3697ab9aee25e7039720e789 not found: ID does not exist" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.543535 4807 scope.go:117] "RemoveContainer" containerID="3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb" Dec 02 20:24:59 crc kubenswrapper[4807]: E1202 20:24:59.544961 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb\": container with ID starting with 3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb not found: ID does not exist" containerID="3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.544990 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb"} err="failed to get container status \"3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb\": rpc error: code = NotFound desc = could not find container \"3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb\": container with ID starting with 3a29642ba9086cef7d3fa427a48598985d20fd9275fb14e394c2f925dbccb1cb not found: ID does not exist" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.575193 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.621943 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2af53e2b-bee9-4bd4-aab6-e58d33057ac7" (UID: 
"2af53e2b-bee9-4bd4-aab6-e58d33057ac7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.641777 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.641839 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.641871 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8593e062-85d8-4f22-88b4-eb7cf5654859-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.641889 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8593e062-85d8-4f22-88b4-eb7cf5654859-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.641915 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 
20:24:59.641952 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.641975 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8593e062-85d8-4f22-88b4-eb7cf5654859-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.642004 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.642025 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8593e062-85d8-4f22-88b4-eb7cf5654859-config-data\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.642039 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj8ld\" (UniqueName: \"kubernetes.io/projected/8593e062-85d8-4f22-88b4-eb7cf5654859-kube-api-access-xj8ld\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.642059 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/8593e062-85d8-4f22-88b4-eb7cf5654859-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.642148 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.642160 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.642172 4807 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2af53e2b-bee9-4bd4-aab6-e58d33057ac7-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.643034 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.643075 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8593e062-85d8-4f22-88b4-eb7cf5654859-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.643326 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") device mount 
path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.643943 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.644751 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8593e062-85d8-4f22-88b4-eb7cf5654859-config-data\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.644840 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8593e062-85d8-4f22-88b4-eb7cf5654859-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.647745 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8593e062-85d8-4f22-88b4-eb7cf5654859-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.648121 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8593e062-85d8-4f22-88b4-eb7cf5654859-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.648404 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.650778 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8593e062-85d8-4f22-88b4-eb7cf5654859-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.665016 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj8ld\" (UniqueName: \"kubernetes.io/projected/8593e062-85d8-4f22-88b4-eb7cf5654859-kube-api-access-xj8ld\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.684249 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"8593e062-85d8-4f22-88b4-eb7cf5654859\") " pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.730929 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.883277 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.896447 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.922548 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.935059 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.940209 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.940326 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nrxx2" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.940566 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.940618 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.941017 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.941750 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.948931 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 20:24:59 crc kubenswrapper[4807]: I1202 20:24:59.962310 4807 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.055109 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/748ead81-bff5-4a69-9398-4e3c91be5979-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.055178 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sssrq\" (UniqueName: \"kubernetes.io/projected/748ead81-bff5-4a69-9398-4e3c91be5979-kube-api-access-sssrq\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.055205 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/748ead81-bff5-4a69-9398-4e3c91be5979-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.055228 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/748ead81-bff5-4a69-9398-4e3c91be5979-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.055263 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/748ead81-bff5-4a69-9398-4e3c91be5979-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.055295 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.055322 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.055372 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/748ead81-bff5-4a69-9398-4e3c91be5979-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.055407 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.055469 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.055495 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.157138 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/748ead81-bff5-4a69-9398-4e3c91be5979-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.157221 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.157286 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.157344 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.157373 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/748ead81-bff5-4a69-9398-4e3c91be5979-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.157894 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.157904 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.158031 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sssrq\" (UniqueName: \"kubernetes.io/projected/748ead81-bff5-4a69-9398-4e3c91be5979-kube-api-access-sssrq\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.158052 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/748ead81-bff5-4a69-9398-4e3c91be5979-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 
20:25:00.158074 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/748ead81-bff5-4a69-9398-4e3c91be5979-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.158107 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/748ead81-bff5-4a69-9398-4e3c91be5979-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.158136 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.158161 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.158259 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.159182 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/748ead81-bff5-4a69-9398-4e3c91be5979-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.159395 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/748ead81-bff5-4a69-9398-4e3c91be5979-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.160119 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/748ead81-bff5-4a69-9398-4e3c91be5979-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.173600 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/748ead81-bff5-4a69-9398-4e3c91be5979-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.174850 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/748ead81-bff5-4a69-9398-4e3c91be5979-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.183862 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.191457 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/748ead81-bff5-4a69-9398-4e3c91be5979-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.206561 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sssrq\" (UniqueName: \"kubernetes.io/projected/748ead81-bff5-4a69-9398-4e3c91be5979-kube-api-access-sssrq\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.219727 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.284357 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"748ead81-bff5-4a69-9398-4e3c91be5979\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.306097 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8593e062-85d8-4f22-88b4-eb7cf5654859","Type":"ContainerStarted","Data":"1ba29099df668e0336af23adcae3989cff2b61a9aa1dd38eb3c88b63b420a919"} Dec 02 20:25:00 crc kubenswrapper[4807]: I1202 20:25:00.559922 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:01 crc kubenswrapper[4807]: I1202 20:25:01.020817 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af53e2b-bee9-4bd4-aab6-e58d33057ac7" path="/var/lib/kubelet/pods/2af53e2b-bee9-4bd4-aab6-e58d33057ac7/volumes" Dec 02 20:25:01 crc kubenswrapper[4807]: I1202 20:25:01.022108 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1b0038-6b67-47e8-92e8-06efe88df856" path="/var/lib/kubelet/pods/ef1b0038-6b67-47e8-92e8-06efe88df856/volumes" Dec 02 20:25:01 crc kubenswrapper[4807]: I1202 20:25:01.066750 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 20:25:01 crc kubenswrapper[4807]: W1202 20:25:01.073175 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod748ead81_bff5_4a69_9398_4e3c91be5979.slice/crio-4c0bde91874417e4e311b99a2e943fdf25de10431b6cd199ab31ab573832dcc3 WatchSource:0}: Error finding container 4c0bde91874417e4e311b99a2e943fdf25de10431b6cd199ab31ab573832dcc3: Status 404 returned error can't find the container with id 4c0bde91874417e4e311b99a2e943fdf25de10431b6cd199ab31ab573832dcc3 Dec 02 20:25:01 crc kubenswrapper[4807]: I1202 20:25:01.362019 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"748ead81-bff5-4a69-9398-4e3c91be5979","Type":"ContainerStarted","Data":"4c0bde91874417e4e311b99a2e943fdf25de10431b6cd199ab31ab573832dcc3"} Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.247028 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-krdfx"] Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.250130 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.256896 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.275066 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-krdfx"] Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.420274 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.420339 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.420616 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-svc\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.420705 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxfqz\" (UniqueName: \"kubernetes.io/projected/e07593fd-c413-4ddb-9107-99c3108451ed-kube-api-access-mxfqz\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " 
pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.421060 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.421119 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.421316 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-config\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.524326 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.524384 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " 
pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.524432 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-svc\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.524456 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxfqz\" (UniqueName: \"kubernetes.io/projected/e07593fd-c413-4ddb-9107-99c3108451ed-kube-api-access-mxfqz\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.524531 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.524555 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.524602 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-config\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc 
kubenswrapper[4807]: I1202 20:25:02.525415 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.526398 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-config\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.528536 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.529774 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.530043 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-svc\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.532940 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.554522 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxfqz\" (UniqueName: \"kubernetes.io/projected/e07593fd-c413-4ddb-9107-99c3108451ed-kube-api-access-mxfqz\") pod \"dnsmasq-dns-d558885bc-krdfx\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:02 crc kubenswrapper[4807]: I1202 20:25:02.576290 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:03 crc kubenswrapper[4807]: I1202 20:25:03.387900 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8593e062-85d8-4f22-88b4-eb7cf5654859","Type":"ContainerStarted","Data":"c550845b76b5bed6609c55fffa998b03d5ed227445b6c9d24cf0d59b28e7d97c"} Dec 02 20:25:03 crc kubenswrapper[4807]: I1202 20:25:03.406087 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"748ead81-bff5-4a69-9398-4e3c91be5979","Type":"ContainerStarted","Data":"114400a1b22b38d42d8258af29c7c666194962443fba1f1ea566d7eaa8ec7dde"} Dec 02 20:25:03 crc kubenswrapper[4807]: I1202 20:25:03.411117 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-krdfx"] Dec 02 20:25:03 crc kubenswrapper[4807]: W1202 20:25:03.430104 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode07593fd_c413_4ddb_9107_99c3108451ed.slice/crio-1f50991c351f51403d3e460a4d514af442819b83f5a59c7eb8c2ed7b99eaa238 WatchSource:0}: Error finding container 
1f50991c351f51403d3e460a4d514af442819b83f5a59c7eb8c2ed7b99eaa238: Status 404 returned error can't find the container with id 1f50991c351f51403d3e460a4d514af442819b83f5a59c7eb8c2ed7b99eaa238 Dec 02 20:25:04 crc kubenswrapper[4807]: I1202 20:25:04.424618 4807 generic.go:334] "Generic (PLEG): container finished" podID="e07593fd-c413-4ddb-9107-99c3108451ed" containerID="4928cd86a492b5f4aa808ab81056fb50f222addc7d6877de3dd175811dffeee7" exitCode=0 Dec 02 20:25:04 crc kubenswrapper[4807]: I1202 20:25:04.424969 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-krdfx" event={"ID":"e07593fd-c413-4ddb-9107-99c3108451ed","Type":"ContainerDied","Data":"4928cd86a492b5f4aa808ab81056fb50f222addc7d6877de3dd175811dffeee7"} Dec 02 20:25:04 crc kubenswrapper[4807]: I1202 20:25:04.425130 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-krdfx" event={"ID":"e07593fd-c413-4ddb-9107-99c3108451ed","Type":"ContainerStarted","Data":"1f50991c351f51403d3e460a4d514af442819b83f5a59c7eb8c2ed7b99eaa238"} Dec 02 20:25:05 crc kubenswrapper[4807]: I1202 20:25:05.443779 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-krdfx" event={"ID":"e07593fd-c413-4ddb-9107-99c3108451ed","Type":"ContainerStarted","Data":"e47356a5686d687716b7d389a137fe06dfc90d87c239bf00df1d81a57c695ca3"} Dec 02 20:25:05 crc kubenswrapper[4807]: I1202 20:25:05.444314 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:05 crc kubenswrapper[4807]: I1202 20:25:05.487242 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-krdfx" podStartSLOduration=3.487212811 podStartE2EDuration="3.487212811s" podCreationTimestamp="2025-12-02 20:25:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 
20:25:05.475668734 +0000 UTC m=+1640.776576259" watchObservedRunningTime="2025-12-02 20:25:05.487212811 +0000 UTC m=+1640.788120326" Dec 02 20:25:06 crc kubenswrapper[4807]: I1202 20:25:06.973336 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:25:06 crc kubenswrapper[4807]: E1202 20:25:06.974649 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:25:12 crc kubenswrapper[4807]: I1202 20:25:12.577974 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:12 crc kubenswrapper[4807]: I1202 20:25:12.680650 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vl9gx"] Dec 02 20:25:12 crc kubenswrapper[4807]: I1202 20:25:12.681400 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" podUID="b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" containerName="dnsmasq-dns" containerID="cri-o://c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900" gracePeriod=10 Dec 02 20:25:12 crc kubenswrapper[4807]: I1202 20:25:12.887368 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b865b64bc-72s4n"] Dec 02 20:25:12 crc kubenswrapper[4807]: I1202 20:25:12.893188 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:12 crc kubenswrapper[4807]: I1202 20:25:12.920676 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b865b64bc-72s4n"] Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.036764 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-dns-svc\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.036835 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-ovsdbserver-nb\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.036898 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-ovsdbserver-sb\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.037127 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-dns-swift-storage-0\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.037193 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvbkw\" (UniqueName: \"kubernetes.io/projected/7cb6399e-e732-4865-9d51-5f15eb42c502-kube-api-access-hvbkw\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.037237 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.037478 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-config\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.141968 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-dns-svc\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.142750 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-ovsdbserver-nb\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.142944 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-ovsdbserver-sb\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.143259 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-dns-swift-storage-0\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.143304 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvbkw\" (UniqueName: \"kubernetes.io/projected/7cb6399e-e732-4865-9d51-5f15eb42c502-kube-api-access-hvbkw\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.143370 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.143533 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-config\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.144264 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-dns-svc\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.145302 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.146493 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-config\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.147319 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-ovsdbserver-nb\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.147344 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-dns-swift-storage-0\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.149992 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7cb6399e-e732-4865-9d51-5f15eb42c502-ovsdbserver-sb\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.190219 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvbkw\" (UniqueName: \"kubernetes.io/projected/7cb6399e-e732-4865-9d51-5f15eb42c502-kube-api-access-hvbkw\") pod \"dnsmasq-dns-6b865b64bc-72s4n\" (UID: \"7cb6399e-e732-4865-9d51-5f15eb42c502\") " pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.219801 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.348525 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.448580 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-nb\") pod \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.449124 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-sb\") pod \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.449177 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-swift-storage-0\") pod 
\"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.449255 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-config\") pod \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.449285 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-svc\") pod \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.449410 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmfg7\" (UniqueName: \"kubernetes.io/projected/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-kube-api-access-dmfg7\") pod \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\" (UID: \"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1\") " Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.463906 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-kube-api-access-dmfg7" (OuterVolumeSpecName: "kube-api-access-dmfg7") pod "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" (UID: "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1"). InnerVolumeSpecName "kube-api-access-dmfg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.517562 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" (UID: "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.528954 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" (UID: "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.536704 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" (UID: "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.543463 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" (UID: "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.543674 4807 generic.go:334] "Generic (PLEG): container finished" podID="b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" containerID="c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900" exitCode=0 Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.543756 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" event={"ID":"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1","Type":"ContainerDied","Data":"c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900"} Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.543784 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.543831 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-vl9gx" event={"ID":"b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1","Type":"ContainerDied","Data":"9621e06487dfd855b1de8f2e360b0f888af0cbace331c994d8edfe154214e216"} Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.543859 4807 scope.go:117] "RemoveContainer" containerID="c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.553544 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmfg7\" (UniqueName: \"kubernetes.io/projected/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-kube-api-access-dmfg7\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.553584 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.553594 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.553604 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.553614 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.562629 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-config" (OuterVolumeSpecName: "config") pod "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" (UID: "b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.570732 4807 scope.go:117] "RemoveContainer" containerID="fee7d5fb73660da659de84b8f792576de6fb7e5b289f4b5f528ff02021381118" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.608362 4807 scope.go:117] "RemoveContainer" containerID="c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900" Dec 02 20:25:13 crc kubenswrapper[4807]: E1202 20:25:13.609164 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900\": container with ID starting with c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900 not found: ID does not exist" containerID="c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.609203 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900"} err="failed to get container status \"c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900\": rpc error: code = NotFound desc = could not find container \"c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900\": container with ID starting with c52f1c38d3609bfa0a3f3212cbfd824968d703e5eef1630e447e8792c09b7900 not found: ID does not exist" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.609248 4807 scope.go:117] "RemoveContainer" containerID="fee7d5fb73660da659de84b8f792576de6fb7e5b289f4b5f528ff02021381118" Dec 02 20:25:13 crc kubenswrapper[4807]: E1202 20:25:13.609476 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee7d5fb73660da659de84b8f792576de6fb7e5b289f4b5f528ff02021381118\": container with ID starting with 
fee7d5fb73660da659de84b8f792576de6fb7e5b289f4b5f528ff02021381118 not found: ID does not exist" containerID="fee7d5fb73660da659de84b8f792576de6fb7e5b289f4b5f528ff02021381118" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.609499 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee7d5fb73660da659de84b8f792576de6fb7e5b289f4b5f528ff02021381118"} err="failed to get container status \"fee7d5fb73660da659de84b8f792576de6fb7e5b289f4b5f528ff02021381118\": rpc error: code = NotFound desc = could not find container \"fee7d5fb73660da659de84b8f792576de6fb7e5b289f4b5f528ff02021381118\": container with ID starting with fee7d5fb73660da659de84b8f792576de6fb7e5b289f4b5f528ff02021381118 not found: ID does not exist" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.655895 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:13 crc kubenswrapper[4807]: I1202 20:25:13.777443 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b865b64bc-72s4n"] Dec 02 20:25:14 crc kubenswrapper[4807]: I1202 20:25:14.015463 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vl9gx"] Dec 02 20:25:14 crc kubenswrapper[4807]: I1202 20:25:14.028480 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-vl9gx"] Dec 02 20:25:14 crc kubenswrapper[4807]: I1202 20:25:14.558597 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" event={"ID":"7cb6399e-e732-4865-9d51-5f15eb42c502","Type":"ContainerDied","Data":"6c301d44a8e82986a7f98c8b8d0868a1cc49a6c8e45383ca0b97870dcbc714a0"} Dec 02 20:25:14 crc kubenswrapper[4807]: I1202 20:25:14.558396 4807 generic.go:334] "Generic (PLEG): container finished" podID="7cb6399e-e732-4865-9d51-5f15eb42c502" 
containerID="6c301d44a8e82986a7f98c8b8d0868a1cc49a6c8e45383ca0b97870dcbc714a0" exitCode=0 Dec 02 20:25:14 crc kubenswrapper[4807]: I1202 20:25:14.559759 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" event={"ID":"7cb6399e-e732-4865-9d51-5f15eb42c502","Type":"ContainerStarted","Data":"9679314e5a86c9629b96ca2d2c0543f270fc48c52fb3a432137daa7b59037d75"} Dec 02 20:25:14 crc kubenswrapper[4807]: I1202 20:25:14.989219 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" path="/var/lib/kubelet/pods/b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1/volumes" Dec 02 20:25:15 crc kubenswrapper[4807]: I1202 20:25:15.575272 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" event={"ID":"7cb6399e-e732-4865-9d51-5f15eb42c502","Type":"ContainerStarted","Data":"11f091c6e23415504e729642f18046e8e136a48ecf80dbbcbd06a0c0603d4d23"} Dec 02 20:25:15 crc kubenswrapper[4807]: I1202 20:25:15.575876 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:15 crc kubenswrapper[4807]: I1202 20:25:15.603083 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" podStartSLOduration=3.603052386 podStartE2EDuration="3.603052386s" podCreationTimestamp="2025-12-02 20:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:25:15.595195079 +0000 UTC m=+1650.896102585" watchObservedRunningTime="2025-12-02 20:25:15.603052386 +0000 UTC m=+1650.903959881" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.143169 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zmrhz"] Dec 02 20:25:18 crc kubenswrapper[4807]: E1202 20:25:18.144231 4807 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" containerName="init" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.144248 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" containerName="init" Dec 02 20:25:18 crc kubenswrapper[4807]: E1202 20:25:18.144287 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" containerName="dnsmasq-dns" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.144295 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" containerName="dnsmasq-dns" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.144514 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c801bb-f6e2-4c8f-855e-eb5eb060d6b1" containerName="dnsmasq-dns" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.151791 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.186058 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmrhz"] Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.280884 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-utilities\") pod \"redhat-operators-zmrhz\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.280951 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-catalog-content\") pod \"redhat-operators-zmrhz\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " 
pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.280996 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtxxb\" (UniqueName: \"kubernetes.io/projected/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-kube-api-access-rtxxb\") pod \"redhat-operators-zmrhz\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.383561 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-utilities\") pod \"redhat-operators-zmrhz\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.383625 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-catalog-content\") pod \"redhat-operators-zmrhz\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.383667 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtxxb\" (UniqueName: \"kubernetes.io/projected/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-kube-api-access-rtxxb\") pod \"redhat-operators-zmrhz\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.384216 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-utilities\") pod \"redhat-operators-zmrhz\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " 
pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.384287 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-catalog-content\") pod \"redhat-operators-zmrhz\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.437771 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtxxb\" (UniqueName: \"kubernetes.io/projected/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-kube-api-access-rtxxb\") pod \"redhat-operators-zmrhz\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.472238 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.974002 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:25:18 crc kubenswrapper[4807]: E1202 20:25:18.974832 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:25:18 crc kubenswrapper[4807]: W1202 20:25:18.985532 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2af9b5_15ce_4b6c_b43c_a0d2035dc618.slice/crio-4ce3ae50826ee3f9d4be93ab561f06dea9e76d1c3d11ff7771b8f73245bb57c5 
WatchSource:0}: Error finding container 4ce3ae50826ee3f9d4be93ab561f06dea9e76d1c3d11ff7771b8f73245bb57c5: Status 404 returned error can't find the container with id 4ce3ae50826ee3f9d4be93ab561f06dea9e76d1c3d11ff7771b8f73245bb57c5 Dec 02 20:25:18 crc kubenswrapper[4807]: I1202 20:25:18.988008 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zmrhz"] Dec 02 20:25:19 crc kubenswrapper[4807]: I1202 20:25:19.635086 4807 generic.go:334] "Generic (PLEG): container finished" podID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerID="67bd3a34f15865fb893a7d2973e8b76a534adde459e04a7bc5006a6a9f5a97d0" exitCode=0 Dec 02 20:25:19 crc kubenswrapper[4807]: I1202 20:25:19.635192 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmrhz" event={"ID":"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618","Type":"ContainerDied","Data":"67bd3a34f15865fb893a7d2973e8b76a534adde459e04a7bc5006a6a9f5a97d0"} Dec 02 20:25:19 crc kubenswrapper[4807]: I1202 20:25:19.635569 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmrhz" event={"ID":"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618","Type":"ContainerStarted","Data":"4ce3ae50826ee3f9d4be93ab561f06dea9e76d1c3d11ff7771b8f73245bb57c5"} Dec 02 20:25:21 crc kubenswrapper[4807]: I1202 20:25:21.665266 4807 generic.go:334] "Generic (PLEG): container finished" podID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerID="9ca05e747a2c826f8646fa7c09d1bf52981298580c80896c4a7d39dd2ad18a2c" exitCode=0 Dec 02 20:25:21 crc kubenswrapper[4807]: I1202 20:25:21.665648 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmrhz" event={"ID":"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618","Type":"ContainerDied","Data":"9ca05e747a2c826f8646fa7c09d1bf52981298580c80896c4a7d39dd2ad18a2c"} Dec 02 20:25:23 crc kubenswrapper[4807]: I1202 20:25:23.222562 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-6b865b64bc-72s4n" Dec 02 20:25:23 crc kubenswrapper[4807]: I1202 20:25:23.319791 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-krdfx"] Dec 02 20:25:23 crc kubenswrapper[4807]: I1202 20:25:23.320963 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-krdfx" podUID="e07593fd-c413-4ddb-9107-99c3108451ed" containerName="dnsmasq-dns" containerID="cri-o://e47356a5686d687716b7d389a137fe06dfc90d87c239bf00df1d81a57c695ca3" gracePeriod=10 Dec 02 20:25:23 crc kubenswrapper[4807]: I1202 20:25:23.708914 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmrhz" event={"ID":"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618","Type":"ContainerStarted","Data":"c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb"} Dec 02 20:25:23 crc kubenswrapper[4807]: I1202 20:25:23.736693 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zmrhz" podStartSLOduration=2.670105899 podStartE2EDuration="5.736655366s" podCreationTimestamp="2025-12-02 20:25:18 +0000 UTC" firstStartedPulling="2025-12-02 20:25:19.637917915 +0000 UTC m=+1654.938825410" lastFinishedPulling="2025-12-02 20:25:22.704467372 +0000 UTC m=+1658.005374877" observedRunningTime="2025-12-02 20:25:23.727738927 +0000 UTC m=+1659.028646422" watchObservedRunningTime="2025-12-02 20:25:23.736655366 +0000 UTC m=+1659.037562861" Dec 02 20:25:24 crc kubenswrapper[4807]: I1202 20:25:24.746520 4807 generic.go:334] "Generic (PLEG): container finished" podID="e07593fd-c413-4ddb-9107-99c3108451ed" containerID="e47356a5686d687716b7d389a137fe06dfc90d87c239bf00df1d81a57c695ca3" exitCode=0 Dec 02 20:25:24 crc kubenswrapper[4807]: I1202 20:25:24.746624 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-krdfx" 
event={"ID":"e07593fd-c413-4ddb-9107-99c3108451ed","Type":"ContainerDied","Data":"e47356a5686d687716b7d389a137fe06dfc90d87c239bf00df1d81a57c695ca3"} Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.186626 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.249945 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-sb\") pod \"e07593fd-c413-4ddb-9107-99c3108451ed\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.250144 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-svc\") pod \"e07593fd-c413-4ddb-9107-99c3108451ed\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.250166 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-config\") pod \"e07593fd-c413-4ddb-9107-99c3108451ed\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.251003 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-openstack-edpm-ipam\") pod \"e07593fd-c413-4ddb-9107-99c3108451ed\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.251076 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-nb\") pod 
\"e07593fd-c413-4ddb-9107-99c3108451ed\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.251145 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-swift-storage-0\") pod \"e07593fd-c413-4ddb-9107-99c3108451ed\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.251176 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxfqz\" (UniqueName: \"kubernetes.io/projected/e07593fd-c413-4ddb-9107-99c3108451ed-kube-api-access-mxfqz\") pod \"e07593fd-c413-4ddb-9107-99c3108451ed\" (UID: \"e07593fd-c413-4ddb-9107-99c3108451ed\") " Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.274091 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07593fd-c413-4ddb-9107-99c3108451ed-kube-api-access-mxfqz" (OuterVolumeSpecName: "kube-api-access-mxfqz") pod "e07593fd-c413-4ddb-9107-99c3108451ed" (UID: "e07593fd-c413-4ddb-9107-99c3108451ed"). InnerVolumeSpecName "kube-api-access-mxfqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.323232 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e07593fd-c413-4ddb-9107-99c3108451ed" (UID: "e07593fd-c413-4ddb-9107-99c3108451ed"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.338818 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e07593fd-c413-4ddb-9107-99c3108451ed" (UID: "e07593fd-c413-4ddb-9107-99c3108451ed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.344663 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "e07593fd-c413-4ddb-9107-99c3108451ed" (UID: "e07593fd-c413-4ddb-9107-99c3108451ed"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.348392 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e07593fd-c413-4ddb-9107-99c3108451ed" (UID: "e07593fd-c413-4ddb-9107-99c3108451ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.349935 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-config" (OuterVolumeSpecName: "config") pod "e07593fd-c413-4ddb-9107-99c3108451ed" (UID: "e07593fd-c413-4ddb-9107-99c3108451ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.351536 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e07593fd-c413-4ddb-9107-99c3108451ed" (UID: "e07593fd-c413-4ddb-9107-99c3108451ed"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.354271 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.354333 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.354357 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.354369 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.354382 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.354394 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e07593fd-c413-4ddb-9107-99c3108451ed-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.354408 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxfqz\" (UniqueName: \"kubernetes.io/projected/e07593fd-c413-4ddb-9107-99c3108451ed-kube-api-access-mxfqz\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.763746 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-krdfx" event={"ID":"e07593fd-c413-4ddb-9107-99c3108451ed","Type":"ContainerDied","Data":"1f50991c351f51403d3e460a4d514af442819b83f5a59c7eb8c2ed7b99eaa238"} Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.763860 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-krdfx" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.764239 4807 scope.go:117] "RemoveContainer" containerID="e47356a5686d687716b7d389a137fe06dfc90d87c239bf00df1d81a57c695ca3" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.801558 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-krdfx"] Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.806463 4807 scope.go:117] "RemoveContainer" containerID="4928cd86a492b5f4aa808ab81056fb50f222addc7d6877de3dd175811dffeee7" Dec 02 20:25:25 crc kubenswrapper[4807]: I1202 20:25:25.816012 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-krdfx"] Dec 02 20:25:26 crc kubenswrapper[4807]: I1202 20:25:26.985701 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07593fd-c413-4ddb-9107-99c3108451ed" path="/var/lib/kubelet/pods/e07593fd-c413-4ddb-9107-99c3108451ed/volumes" Dec 02 20:25:28 crc kubenswrapper[4807]: I1202 20:25:28.474004 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:28 crc kubenswrapper[4807]: I1202 20:25:28.474076 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:29 crc kubenswrapper[4807]: I1202 20:25:29.540387 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zmrhz" podUID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerName="registry-server" probeResult="failure" output=< Dec 02 20:25:29 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 20:25:29 crc kubenswrapper[4807]: > Dec 02 20:25:31 crc kubenswrapper[4807]: I1202 20:25:31.973238 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:25:31 crc kubenswrapper[4807]: E1202 20:25:31.974145 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:25:35 crc kubenswrapper[4807]: I1202 20:25:35.964425 4807 generic.go:334] "Generic (PLEG): container finished" podID="8593e062-85d8-4f22-88b4-eb7cf5654859" containerID="c550845b76b5bed6609c55fffa998b03d5ed227445b6c9d24cf0d59b28e7d97c" exitCode=0 Dec 02 20:25:35 crc kubenswrapper[4807]: I1202 20:25:35.964519 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8593e062-85d8-4f22-88b4-eb7cf5654859","Type":"ContainerDied","Data":"c550845b76b5bed6609c55fffa998b03d5ed227445b6c9d24cf0d59b28e7d97c"} Dec 02 20:25:35 crc kubenswrapper[4807]: I1202 20:25:35.969139 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="748ead81-bff5-4a69-9398-4e3c91be5979" containerID="114400a1b22b38d42d8258af29c7c666194962443fba1f1ea566d7eaa8ec7dde" exitCode=0 Dec 02 20:25:35 crc kubenswrapper[4807]: I1202 20:25:35.969204 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"748ead81-bff5-4a69-9398-4e3c91be5979","Type":"ContainerDied","Data":"114400a1b22b38d42d8258af29c7c666194962443fba1f1ea566d7eaa8ec7dde"} Dec 02 20:25:36 crc kubenswrapper[4807]: I1202 20:25:36.992752 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8593e062-85d8-4f22-88b4-eb7cf5654859","Type":"ContainerStarted","Data":"e442a4dfe3899a64c63684fa605612d3fb0c65703513e584aeefa97555f3231c"} Dec 02 20:25:36 crc kubenswrapper[4807]: I1202 20:25:36.994049 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 20:25:37 crc kubenswrapper[4807]: I1202 20:25:37.005110 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"748ead81-bff5-4a69-9398-4e3c91be5979","Type":"ContainerStarted","Data":"a6d7318a6ad5dfcaa380caa02166c8bb50e216b9664ef23798b58d61cba0db92"} Dec 02 20:25:37 crc kubenswrapper[4807]: I1202 20:25:37.006129 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:37 crc kubenswrapper[4807]: I1202 20:25:37.046163 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.04612546 podStartE2EDuration="38.04612546s" podCreationTimestamp="2025-12-02 20:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:25:37.029357786 +0000 UTC m=+1672.330265281" watchObservedRunningTime="2025-12-02 20:25:37.04612546 +0000 UTC m=+1672.347032955" Dec 02 20:25:37 crc kubenswrapper[4807]: I1202 
20:25:37.070787 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.070769282 podStartE2EDuration="38.070769282s" podCreationTimestamp="2025-12-02 20:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:25:37.063273326 +0000 UTC m=+1672.364180821" watchObservedRunningTime="2025-12-02 20:25:37.070769282 +0000 UTC m=+1672.371676777" Dec 02 20:25:38 crc kubenswrapper[4807]: I1202 20:25:38.541654 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:38 crc kubenswrapper[4807]: I1202 20:25:38.611765 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:38 crc kubenswrapper[4807]: I1202 20:25:38.795317 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmrhz"] Dec 02 20:25:40 crc kubenswrapper[4807]: I1202 20:25:40.036921 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zmrhz" podUID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerName="registry-server" containerID="cri-o://c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb" gracePeriod=2 Dec 02 20:25:40 crc kubenswrapper[4807]: I1202 20:25:40.890044 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:40 crc kubenswrapper[4807]: I1202 20:25:40.922821 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-utilities\") pod \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " Dec 02 20:25:40 crc kubenswrapper[4807]: I1202 20:25:40.923058 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtxxb\" (UniqueName: \"kubernetes.io/projected/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-kube-api-access-rtxxb\") pod \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " Dec 02 20:25:40 crc kubenswrapper[4807]: I1202 20:25:40.923136 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-catalog-content\") pod \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\" (UID: \"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618\") " Dec 02 20:25:40 crc kubenswrapper[4807]: I1202 20:25:40.924705 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-utilities" (OuterVolumeSpecName: "utilities") pod "5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" (UID: "5b2af9b5-15ce-4b6c-b43c-a0d2035dc618"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:25:40 crc kubenswrapper[4807]: I1202 20:25:40.941939 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-kube-api-access-rtxxb" (OuterVolumeSpecName: "kube-api-access-rtxxb") pod "5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" (UID: "5b2af9b5-15ce-4b6c-b43c-a0d2035dc618"). InnerVolumeSpecName "kube-api-access-rtxxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.027751 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtxxb\" (UniqueName: \"kubernetes.io/projected/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-kube-api-access-rtxxb\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.027795 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.050012 4807 generic.go:334] "Generic (PLEG): container finished" podID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerID="c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb" exitCode=0 Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.050070 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmrhz" event={"ID":"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618","Type":"ContainerDied","Data":"c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb"} Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.050109 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zmrhz" event={"ID":"5b2af9b5-15ce-4b6c-b43c-a0d2035dc618","Type":"ContainerDied","Data":"4ce3ae50826ee3f9d4be93ab561f06dea9e76d1c3d11ff7771b8f73245bb57c5"} Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.050130 4807 scope.go:117] "RemoveContainer" containerID="c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.050301 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zmrhz" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.075870 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" (UID: "5b2af9b5-15ce-4b6c-b43c-a0d2035dc618"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.079222 4807 scope.go:117] "RemoveContainer" containerID="9ca05e747a2c826f8646fa7c09d1bf52981298580c80896c4a7d39dd2ad18a2c" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.111205 4807 scope.go:117] "RemoveContainer" containerID="67bd3a34f15865fb893a7d2973e8b76a534adde459e04a7bc5006a6a9f5a97d0" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.129753 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.148894 4807 scope.go:117] "RemoveContainer" containerID="c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb" Dec 02 20:25:41 crc kubenswrapper[4807]: E1202 20:25:41.149564 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb\": container with ID starting with c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb not found: ID does not exist" containerID="c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.149620 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb"} err="failed to get container status \"c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb\": rpc error: code = NotFound desc = could not find container \"c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb\": container with ID starting with c79c1a1f16efe54a4565262360773b2101d3c14d5acc310eacfd6d078587a7eb not found: ID does not exist" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.149656 4807 scope.go:117] "RemoveContainer" containerID="9ca05e747a2c826f8646fa7c09d1bf52981298580c80896c4a7d39dd2ad18a2c" Dec 02 20:25:41 crc kubenswrapper[4807]: E1202 20:25:41.150253 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ca05e747a2c826f8646fa7c09d1bf52981298580c80896c4a7d39dd2ad18a2c\": container with ID starting with 9ca05e747a2c826f8646fa7c09d1bf52981298580c80896c4a7d39dd2ad18a2c not found: ID does not exist" containerID="9ca05e747a2c826f8646fa7c09d1bf52981298580c80896c4a7d39dd2ad18a2c" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.150296 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ca05e747a2c826f8646fa7c09d1bf52981298580c80896c4a7d39dd2ad18a2c"} err="failed to get container status \"9ca05e747a2c826f8646fa7c09d1bf52981298580c80896c4a7d39dd2ad18a2c\": rpc error: code = NotFound desc = could not find container \"9ca05e747a2c826f8646fa7c09d1bf52981298580c80896c4a7d39dd2ad18a2c\": container with ID starting with 9ca05e747a2c826f8646fa7c09d1bf52981298580c80896c4a7d39dd2ad18a2c not found: ID does not exist" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.150321 4807 scope.go:117] "RemoveContainer" containerID="67bd3a34f15865fb893a7d2973e8b76a534adde459e04a7bc5006a6a9f5a97d0" Dec 02 20:25:41 crc kubenswrapper[4807]: E1202 20:25:41.151135 4807 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"67bd3a34f15865fb893a7d2973e8b76a534adde459e04a7bc5006a6a9f5a97d0\": container with ID starting with 67bd3a34f15865fb893a7d2973e8b76a534adde459e04a7bc5006a6a9f5a97d0 not found: ID does not exist" containerID="67bd3a34f15865fb893a7d2973e8b76a534adde459e04a7bc5006a6a9f5a97d0" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.151167 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67bd3a34f15865fb893a7d2973e8b76a534adde459e04a7bc5006a6a9f5a97d0"} err="failed to get container status \"67bd3a34f15865fb893a7d2973e8b76a534adde459e04a7bc5006a6a9f5a97d0\": rpc error: code = NotFound desc = could not find container \"67bd3a34f15865fb893a7d2973e8b76a534adde459e04a7bc5006a6a9f5a97d0\": container with ID starting with 67bd3a34f15865fb893a7d2973e8b76a534adde459e04a7bc5006a6a9f5a97d0 not found: ID does not exist" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.386606 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zmrhz"] Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.396363 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zmrhz"] Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.733211 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk"] Dec 02 20:25:41 crc kubenswrapper[4807]: E1202 20:25:41.733880 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerName="registry-server" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.733903 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerName="registry-server" Dec 02 20:25:41 crc kubenswrapper[4807]: E1202 20:25:41.733921 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerName="extract-content" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.733930 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerName="extract-content" Dec 02 20:25:41 crc kubenswrapper[4807]: E1202 20:25:41.734021 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07593fd-c413-4ddb-9107-99c3108451ed" containerName="init" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.734034 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07593fd-c413-4ddb-9107-99c3108451ed" containerName="init" Dec 02 20:25:41 crc kubenswrapper[4807]: E1202 20:25:41.734056 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerName="extract-utilities" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.734066 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerName="extract-utilities" Dec 02 20:25:41 crc kubenswrapper[4807]: E1202 20:25:41.734095 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07593fd-c413-4ddb-9107-99c3108451ed" containerName="dnsmasq-dns" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.734104 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07593fd-c413-4ddb-9107-99c3108451ed" containerName="dnsmasq-dns" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.734377 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e07593fd-c413-4ddb-9107-99c3108451ed" containerName="dnsmasq-dns" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.734400 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" containerName="registry-server" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.735336 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.742197 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.744060 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.744471 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.751432 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.767780 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk"] Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.844891 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdkgj\" (UniqueName: \"kubernetes.io/projected/8483568b-e11d-4aea-8fcb-1925d2e64fa2-kube-api-access-rdkgj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.844987 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: 
I1202 20:25:41.845039 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.845072 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.947449 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.947617 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdkgj\" (UniqueName: \"kubernetes.io/projected/8483568b-e11d-4aea-8fcb-1925d2e64fa2-kube-api-access-rdkgj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.947687 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.947753 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.952974 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.953237 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.953298 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:41 crc kubenswrapper[4807]: I1202 20:25:41.967492 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rdkgj\" (UniqueName: \"kubernetes.io/projected/8483568b-e11d-4aea-8fcb-1925d2e64fa2-kube-api-access-rdkgj\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:42 crc kubenswrapper[4807]: I1202 20:25:42.072698 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:25:42 crc kubenswrapper[4807]: I1202 20:25:42.681507 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk"] Dec 02 20:25:42 crc kubenswrapper[4807]: W1202 20:25:42.685938 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8483568b_e11d_4aea_8fcb_1925d2e64fa2.slice/crio-f223b395fcf5e39d52018995d6999a1df04c801bfcdb1c55c8b60086b9c297f0 WatchSource:0}: Error finding container f223b395fcf5e39d52018995d6999a1df04c801bfcdb1c55c8b60086b9c297f0: Status 404 returned error can't find the container with id f223b395fcf5e39d52018995d6999a1df04c801bfcdb1c55c8b60086b9c297f0 Dec 02 20:25:42 crc kubenswrapper[4807]: I1202 20:25:42.996047 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b2af9b5-15ce-4b6c-b43c-a0d2035dc618" path="/var/lib/kubelet/pods/5b2af9b5-15ce-4b6c-b43c-a0d2035dc618/volumes" Dec 02 20:25:43 crc kubenswrapper[4807]: I1202 20:25:43.094756 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" event={"ID":"8483568b-e11d-4aea-8fcb-1925d2e64fa2","Type":"ContainerStarted","Data":"f223b395fcf5e39d52018995d6999a1df04c801bfcdb1c55c8b60086b9c297f0"} Dec 02 20:25:46 crc kubenswrapper[4807]: I1202 20:25:46.973106 4807 scope.go:117] "RemoveContainer" 
containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:25:46 crc kubenswrapper[4807]: E1202 20:25:46.974096 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:25:49 crc kubenswrapper[4807]: I1202 20:25:49.742932 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 20:25:50 crc kubenswrapper[4807]: I1202 20:25:50.568931 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 20:25:54 crc kubenswrapper[4807]: I1202 20:25:54.782054 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:25:55 crc kubenswrapper[4807]: I1202 20:25:55.278526 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" event={"ID":"8483568b-e11d-4aea-8fcb-1925d2e64fa2","Type":"ContainerStarted","Data":"624bfc5846daf8b164a52537e63e94d1e15d99cbb0b46b4c9c659e0785668c7b"} Dec 02 20:25:55 crc kubenswrapper[4807]: I1202 20:25:55.325470 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" podStartSLOduration=2.236583489 podStartE2EDuration="14.325431764s" podCreationTimestamp="2025-12-02 20:25:41 +0000 UTC" firstStartedPulling="2025-12-02 20:25:42.690098793 +0000 UTC m=+1677.991006288" lastFinishedPulling="2025-12-02 20:25:54.778947068 +0000 UTC m=+1690.079854563" observedRunningTime="2025-12-02 20:25:55.317880176 +0000 UTC m=+1690.618787681" 
watchObservedRunningTime="2025-12-02 20:25:55.325431764 +0000 UTC m=+1690.626339249" Dec 02 20:26:00 crc kubenswrapper[4807]: I1202 20:26:00.977656 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:26:00 crc kubenswrapper[4807]: E1202 20:26:00.980407 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:26:07 crc kubenswrapper[4807]: I1202 20:26:07.428668 4807 generic.go:334] "Generic (PLEG): container finished" podID="8483568b-e11d-4aea-8fcb-1925d2e64fa2" containerID="624bfc5846daf8b164a52537e63e94d1e15d99cbb0b46b4c9c659e0785668c7b" exitCode=0 Dec 02 20:26:07 crc kubenswrapper[4807]: I1202 20:26:07.428880 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" event={"ID":"8483568b-e11d-4aea-8fcb-1925d2e64fa2","Type":"ContainerDied","Data":"624bfc5846daf8b164a52537e63e94d1e15d99cbb0b46b4c9c659e0785668c7b"} Dec 02 20:26:07 crc kubenswrapper[4807]: I1202 20:26:07.931430 4807 scope.go:117] "RemoveContainer" containerID="783ee09a7ea28f99781fd2a5396867d821685cbfa7907f2db86f0446e40d3775" Dec 02 20:26:08 crc kubenswrapper[4807]: I1202 20:26:08.930181 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.044098 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-ssh-key\") pod \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.044269 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-inventory\") pod \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.044372 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-repo-setup-combined-ca-bundle\") pod \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.044531 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdkgj\" (UniqueName: \"kubernetes.io/projected/8483568b-e11d-4aea-8fcb-1925d2e64fa2-kube-api-access-rdkgj\") pod \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\" (UID: \"8483568b-e11d-4aea-8fcb-1925d2e64fa2\") " Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.053735 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8483568b-e11d-4aea-8fcb-1925d2e64fa2" (UID: "8483568b-e11d-4aea-8fcb-1925d2e64fa2"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.057908 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8483568b-e11d-4aea-8fcb-1925d2e64fa2-kube-api-access-rdkgj" (OuterVolumeSpecName: "kube-api-access-rdkgj") pod "8483568b-e11d-4aea-8fcb-1925d2e64fa2" (UID: "8483568b-e11d-4aea-8fcb-1925d2e64fa2"). InnerVolumeSpecName "kube-api-access-rdkgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.092359 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-inventory" (OuterVolumeSpecName: "inventory") pod "8483568b-e11d-4aea-8fcb-1925d2e64fa2" (UID: "8483568b-e11d-4aea-8fcb-1925d2e64fa2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.095698 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8483568b-e11d-4aea-8fcb-1925d2e64fa2" (UID: "8483568b-e11d-4aea-8fcb-1925d2e64fa2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.147949 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdkgj\" (UniqueName: \"kubernetes.io/projected/8483568b-e11d-4aea-8fcb-1925d2e64fa2-kube-api-access-rdkgj\") on node \"crc\" DevicePath \"\"" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.148005 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.148020 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.148043 4807 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8483568b-e11d-4aea-8fcb-1925d2e64fa2-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.464763 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" event={"ID":"8483568b-e11d-4aea-8fcb-1925d2e64fa2","Type":"ContainerDied","Data":"f223b395fcf5e39d52018995d6999a1df04c801bfcdb1c55c8b60086b9c297f0"} Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.464822 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f223b395fcf5e39d52018995d6999a1df04c801bfcdb1c55c8b60086b9c297f0" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.464901 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.577174 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9"] Dec 02 20:26:09 crc kubenswrapper[4807]: E1202 20:26:09.578413 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8483568b-e11d-4aea-8fcb-1925d2e64fa2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.578455 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8483568b-e11d-4aea-8fcb-1925d2e64fa2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.578864 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="8483568b-e11d-4aea-8fcb-1925d2e64fa2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.580182 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.582278 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.582932 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.585038 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.585264 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.608900 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9"] Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.760151 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89mf9\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.760204 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z6s7\" (UniqueName: \"kubernetes.io/projected/2548dec0-ad57-411d-891a-0b847b25a4bb-kube-api-access-5z6s7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89mf9\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.760405 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89mf9\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.863619 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89mf9\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.863747 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89mf9\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.863828 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z6s7\" (UniqueName: \"kubernetes.io/projected/2548dec0-ad57-411d-891a-0b847b25a4bb-kube-api-access-5z6s7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89mf9\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.871533 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89mf9\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.871985 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89mf9\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.882341 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z6s7\" (UniqueName: \"kubernetes.io/projected/2548dec0-ad57-411d-891a-0b847b25a4bb-kube-api-access-5z6s7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-89mf9\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:09 crc kubenswrapper[4807]: I1202 20:26:09.906827 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:10 crc kubenswrapper[4807]: I1202 20:26:10.517229 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9"] Dec 02 20:26:10 crc kubenswrapper[4807]: I1202 20:26:10.537443 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 20:26:11 crc kubenswrapper[4807]: I1202 20:26:11.492634 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" event={"ID":"2548dec0-ad57-411d-891a-0b847b25a4bb","Type":"ContainerStarted","Data":"2f2c374ec8a8408485ef859cdf60514ee256a993665346a18738f6a1e100af34"} Dec 02 20:26:11 crc kubenswrapper[4807]: I1202 20:26:11.493501 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" event={"ID":"2548dec0-ad57-411d-891a-0b847b25a4bb","Type":"ContainerStarted","Data":"776e73de17e841ad130cdf288d8558faa881acf4e3d9d5b3a32a8fc236fd70f3"} Dec 02 20:26:11 crc kubenswrapper[4807]: I1202 20:26:11.534857 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" podStartSLOduration=1.993608129 podStartE2EDuration="2.534823035s" podCreationTimestamp="2025-12-02 20:26:09 +0000 UTC" firstStartedPulling="2025-12-02 20:26:10.537106529 +0000 UTC m=+1705.838014034" lastFinishedPulling="2025-12-02 20:26:11.078321445 +0000 UTC m=+1706.379228940" observedRunningTime="2025-12-02 20:26:11.521067201 +0000 UTC m=+1706.821974706" watchObservedRunningTime="2025-12-02 20:26:11.534823035 +0000 UTC m=+1706.835730540" Dec 02 20:26:12 crc kubenswrapper[4807]: I1202 20:26:12.972560 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:26:12 crc kubenswrapper[4807]: E1202 20:26:12.973378 
4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:26:14 crc kubenswrapper[4807]: I1202 20:26:14.615504 4807 generic.go:334] "Generic (PLEG): container finished" podID="2548dec0-ad57-411d-891a-0b847b25a4bb" containerID="2f2c374ec8a8408485ef859cdf60514ee256a993665346a18738f6a1e100af34" exitCode=0 Dec 02 20:26:14 crc kubenswrapper[4807]: I1202 20:26:14.615615 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" event={"ID":"2548dec0-ad57-411d-891a-0b847b25a4bb","Type":"ContainerDied","Data":"2f2c374ec8a8408485ef859cdf60514ee256a993665346a18738f6a1e100af34"} Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.110298 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.307635 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-inventory\") pod \"2548dec0-ad57-411d-891a-0b847b25a4bb\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.307826 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-ssh-key\") pod \"2548dec0-ad57-411d-891a-0b847b25a4bb\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.308128 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z6s7\" (UniqueName: \"kubernetes.io/projected/2548dec0-ad57-411d-891a-0b847b25a4bb-kube-api-access-5z6s7\") pod \"2548dec0-ad57-411d-891a-0b847b25a4bb\" (UID: \"2548dec0-ad57-411d-891a-0b847b25a4bb\") " Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.314272 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2548dec0-ad57-411d-891a-0b847b25a4bb-kube-api-access-5z6s7" (OuterVolumeSpecName: "kube-api-access-5z6s7") pod "2548dec0-ad57-411d-891a-0b847b25a4bb" (UID: "2548dec0-ad57-411d-891a-0b847b25a4bb"). InnerVolumeSpecName "kube-api-access-5z6s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.340807 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2548dec0-ad57-411d-891a-0b847b25a4bb" (UID: "2548dec0-ad57-411d-891a-0b847b25a4bb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.351532 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-inventory" (OuterVolumeSpecName: "inventory") pod "2548dec0-ad57-411d-891a-0b847b25a4bb" (UID: "2548dec0-ad57-411d-891a-0b847b25a4bb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.411577 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.411620 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z6s7\" (UniqueName: \"kubernetes.io/projected/2548dec0-ad57-411d-891a-0b847b25a4bb-kube-api-access-5z6s7\") on node \"crc\" DevicePath \"\"" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.411635 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2548dec0-ad57-411d-891a-0b847b25a4bb-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.639308 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" event={"ID":"2548dec0-ad57-411d-891a-0b847b25a4bb","Type":"ContainerDied","Data":"776e73de17e841ad130cdf288d8558faa881acf4e3d9d5b3a32a8fc236fd70f3"} Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.639368 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="776e73de17e841ad130cdf288d8558faa881acf4e3d9d5b3a32a8fc236fd70f3" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.639447 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-89mf9" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.743199 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28"] Dec 02 20:26:16 crc kubenswrapper[4807]: E1202 20:26:16.744249 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2548dec0-ad57-411d-891a-0b847b25a4bb" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.744281 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2548dec0-ad57-411d-891a-0b847b25a4bb" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.744590 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2548dec0-ad57-411d-891a-0b847b25a4bb" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.745877 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.749158 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.751480 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.751530 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.751623 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.753461 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28"] Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.819195 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpgms\" (UniqueName: \"kubernetes.io/projected/59148eea-351d-4d4c-ba60-e39e47372466-kube-api-access-lpgms\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.819337 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.819375 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.819493 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.921203 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.921320 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpgms\" (UniqueName: \"kubernetes.io/projected/59148eea-351d-4d4c-ba60-e39e47372466-kube-api-access-lpgms\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.921377 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.921396 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.926334 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.926796 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 20:26:16.927392 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:16 crc kubenswrapper[4807]: I1202 
20:26:16.940782 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpgms\" (UniqueName: \"kubernetes.io/projected/59148eea-351d-4d4c-ba60-e39e47372466-kube-api-access-lpgms\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-74b28\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:17 crc kubenswrapper[4807]: I1202 20:26:17.080877 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:26:17 crc kubenswrapper[4807]: I1202 20:26:17.668200 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28"] Dec 02 20:26:18 crc kubenswrapper[4807]: I1202 20:26:18.674950 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" event={"ID":"59148eea-351d-4d4c-ba60-e39e47372466","Type":"ContainerStarted","Data":"cc2464f467d1f2bbec0be74790fe223171013ae4b91e266b7cac2e792b46d740"} Dec 02 20:26:18 crc kubenswrapper[4807]: I1202 20:26:18.675835 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" event={"ID":"59148eea-351d-4d4c-ba60-e39e47372466","Type":"ContainerStarted","Data":"e373b7f5d4ddc9ab18d4efc78d08733d7daa278d7465a6de1cba2606488cfb95"} Dec 02 20:26:18 crc kubenswrapper[4807]: I1202 20:26:18.706821 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" podStartSLOduration=2.300254366 podStartE2EDuration="2.706792953s" podCreationTimestamp="2025-12-02 20:26:16 +0000 UTC" firstStartedPulling="2025-12-02 20:26:17.678678283 +0000 UTC m=+1712.979585798" lastFinishedPulling="2025-12-02 20:26:18.0852169 +0000 UTC m=+1713.386124385" observedRunningTime="2025-12-02 20:26:18.696343119 
+0000 UTC m=+1713.997250684" watchObservedRunningTime="2025-12-02 20:26:18.706792953 +0000 UTC m=+1714.007700448" Dec 02 20:26:23 crc kubenswrapper[4807]: I1202 20:26:23.973528 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:26:23 crc kubenswrapper[4807]: E1202 20:26:23.974664 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:26:34 crc kubenswrapper[4807]: I1202 20:26:34.981867 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:26:34 crc kubenswrapper[4807]: E1202 20:26:34.985420 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:26:49 crc kubenswrapper[4807]: I1202 20:26:49.973442 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:26:49 crc kubenswrapper[4807]: E1202 20:26:49.974799 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:27:04 crc kubenswrapper[4807]: I1202 20:27:04.981103 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:27:04 crc kubenswrapper[4807]: E1202 20:27:04.982318 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:27:08 crc kubenswrapper[4807]: I1202 20:27:08.076442 4807 scope.go:117] "RemoveContainer" containerID="af444d5027b97215d0398e19787af2420a5406dd8f3f33a00bd64ff2801c5918" Dec 02 20:27:08 crc kubenswrapper[4807]: I1202 20:27:08.118246 4807 scope.go:117] "RemoveContainer" containerID="3b720548325b56bbd73e3a091b36150cb610105cc5099286cdec944dcb1bb854" Dec 02 20:27:08 crc kubenswrapper[4807]: I1202 20:27:08.145619 4807 scope.go:117] "RemoveContainer" containerID="a1234d2f9bbea41e51f17c78fe72a9d053c485e8d7dd5e87d80b7a256d014711" Dec 02 20:27:08 crc kubenswrapper[4807]: I1202 20:27:08.171651 4807 scope.go:117] "RemoveContainer" containerID="c3c0926311cd59fead8aa752b834bf050f2e0d30673563e3d2b353c21c7fab3e" Dec 02 20:27:08 crc kubenswrapper[4807]: I1202 20:27:08.212084 4807 scope.go:117] "RemoveContainer" containerID="3e89b0a0ee43f72620258f6751cc07b2cceb26eb9b6f280d686619faa8a95a3c" Dec 02 20:27:08 crc kubenswrapper[4807]: I1202 20:27:08.248253 4807 scope.go:117] "RemoveContainer" containerID="3b918f0ec4719dbf7628ce7a7d4f4f05823795a4512433c7ef6fbe8f2c53e1ab" Dec 02 20:27:08 crc 
kubenswrapper[4807]: I1202 20:27:08.285874 4807 scope.go:117] "RemoveContainer" containerID="4742c9b21a7f9c5e7eae76e27daec65c71239088aaa0cd26d31987b735ceb989" Dec 02 20:27:19 crc kubenswrapper[4807]: I1202 20:27:19.973207 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:27:19 crc kubenswrapper[4807]: E1202 20:27:19.974380 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:27:34 crc kubenswrapper[4807]: I1202 20:27:34.984146 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:27:34 crc kubenswrapper[4807]: E1202 20:27:34.985300 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:27:46 crc kubenswrapper[4807]: I1202 20:27:46.974037 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:27:46 crc kubenswrapper[4807]: E1202 20:27:46.975253 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:27:58 crc kubenswrapper[4807]: I1202 20:27:58.973565 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:27:58 crc kubenswrapper[4807]: E1202 20:27:58.975863 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:28:08 crc kubenswrapper[4807]: I1202 20:28:08.451514 4807 scope.go:117] "RemoveContainer" containerID="5dfebc0bed7f33a08fa20076c3d85f44fe7d3e88742feff016e12ba029d718c6" Dec 02 20:28:08 crc kubenswrapper[4807]: I1202 20:28:08.490072 4807 scope.go:117] "RemoveContainer" containerID="09e4f83dac6491d97d3f260dec631bdf13655deccee4f8d59eb829f923b744ae" Dec 02 20:28:08 crc kubenswrapper[4807]: I1202 20:28:08.529385 4807 scope.go:117] "RemoveContainer" containerID="342402e56084c324d14fbaabed363fa062033c990511ad4a68bcbaf840f6cfb5" Dec 02 20:28:08 crc kubenswrapper[4807]: I1202 20:28:08.564317 4807 scope.go:117] "RemoveContainer" containerID="45e64ab22e737f56f56b778637c988bbd807c4d6ee2d5839fc5c5c7fb5190739" Dec 02 20:28:08 crc kubenswrapper[4807]: I1202 20:28:08.591157 4807 scope.go:117] "RemoveContainer" containerID="ab302bb0fb63c84c0270fba97fd710bdfd6b0e56b5a534f17d053da947f84866" Dec 02 20:28:08 crc kubenswrapper[4807]: I1202 20:28:08.618678 4807 scope.go:117] "RemoveContainer" containerID="01e2c4637f6823cccb3acfcb3e34949a409fd95e636c46c42398c7849ab82587" Dec 02 20:28:08 crc 
kubenswrapper[4807]: I1202 20:28:08.654783 4807 scope.go:117] "RemoveContainer" containerID="47732380b94d7a27cfc907079dae39ca54e698ec0f075a743a6b883c19a2ea27" Dec 02 20:28:08 crc kubenswrapper[4807]: I1202 20:28:08.683173 4807 scope.go:117] "RemoveContainer" containerID="c9624798dede7c456625a9d09b4525f07b4df33ab18d8bc416cb442386992a8d" Dec 02 20:28:08 crc kubenswrapper[4807]: I1202 20:28:08.709805 4807 scope.go:117] "RemoveContainer" containerID="208d13485cde16a92ad340b86623b207277c84223e68018f32ff78c1e017584c" Dec 02 20:28:11 crc kubenswrapper[4807]: I1202 20:28:11.972456 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:28:11 crc kubenswrapper[4807]: E1202 20:28:11.973459 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:28:26 crc kubenswrapper[4807]: I1202 20:28:26.973145 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:28:26 crc kubenswrapper[4807]: E1202 20:28:26.974207 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:28:41 crc kubenswrapper[4807]: I1202 20:28:41.973819 4807 scope.go:117] "RemoveContainer" 
containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:28:41 crc kubenswrapper[4807]: E1202 20:28:41.975256 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:28:56 crc kubenswrapper[4807]: I1202 20:28:56.973272 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:28:56 crc kubenswrapper[4807]: E1202 20:28:56.974948 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:29:08 crc kubenswrapper[4807]: I1202 20:29:08.803312 4807 scope.go:117] "RemoveContainer" containerID="f0fc967027c35c9af4d717f0647556452e3f07dadb2522d9edd32e6232dd7505" Dec 02 20:29:08 crc kubenswrapper[4807]: I1202 20:29:08.834024 4807 scope.go:117] "RemoveContainer" containerID="f61970ea9d319d6dfdbde4f05220c8f50a611b1027f70751b2a41f910279e245" Dec 02 20:29:08 crc kubenswrapper[4807]: I1202 20:29:08.862618 4807 scope.go:117] "RemoveContainer" containerID="6d3f5926fca33558c6e2b96aeb30eae016be1857ecb1c3c07825deed455bad7c" Dec 02 20:29:08 crc kubenswrapper[4807]: I1202 20:29:08.885921 4807 scope.go:117] "RemoveContainer" containerID="0355a01dd916432a0f9b633efa09d905adadcb29d67e95267b540614d88aa257" Dec 02 20:29:08 crc 
kubenswrapper[4807]: I1202 20:29:08.916563 4807 scope.go:117] "RemoveContainer" containerID="e2e70bc569a7569c022be93e464015b05f77c409760e111d8464bc1a2513f1b1" Dec 02 20:29:08 crc kubenswrapper[4807]: I1202 20:29:08.939357 4807 scope.go:117] "RemoveContainer" containerID="88f09b99835ddf7ee29bc4a6b2c519c183c21b3c370aca7e13b229ebf50952a1" Dec 02 20:29:08 crc kubenswrapper[4807]: I1202 20:29:08.968210 4807 scope.go:117] "RemoveContainer" containerID="dc6133c01f5d78484890fd8acb046b9029b28806b82f86cd99234db1dbb5c1a6" Dec 02 20:29:08 crc kubenswrapper[4807]: I1202 20:29:08.973588 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:29:08 crc kubenswrapper[4807]: E1202 20:29:08.974292 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:29:08 crc kubenswrapper[4807]: I1202 20:29:08.998104 4807 scope.go:117] "RemoveContainer" containerID="a91b7221a9ef17496b2831a07e93cbc306113c6716495c912d3c40bdeb3437f5" Dec 02 20:29:09 crc kubenswrapper[4807]: I1202 20:29:09.029430 4807 scope.go:117] "RemoveContainer" containerID="0cbc4bad70b8310b16e320ddf5ec62e945616f5ced701471d648bb8da57dd93b" Dec 02 20:29:22 crc kubenswrapper[4807]: I1202 20:29:22.973150 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:29:22 crc kubenswrapper[4807]: E1202 20:29:22.974247 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:29:30 crc kubenswrapper[4807]: I1202 20:29:30.046514 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2wbvt"] Dec 02 20:29:30 crc kubenswrapper[4807]: I1202 20:29:30.064694 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2wbvt"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.002895 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09c59d51-437b-44fc-a75a-d36757cbb08a" path="/var/lib/kubelet/pods/09c59d51-437b-44fc-a75a-d36757cbb08a/volumes" Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.045090 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-x5fkh"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.064630 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2vb9g"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.083352 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-46a0-account-create-update-6nrzt"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.094468 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-x5fkh"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.107253 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3e9b-account-create-update-6hr9w"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.123999 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-23da-account-create-update-6mqn7"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.136611 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2vb9g"] Dec 02 20:29:31 crc 
kubenswrapper[4807]: I1202 20:29:31.146841 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-46a0-account-create-update-6nrzt"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.156542 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3e9b-account-create-update-6hr9w"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.164978 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-23da-account-create-update-6mqn7"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.175205 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-54fc-account-create-update-c9zhp"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.186517 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-54fc-account-create-update-c9zhp"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.195877 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-6dkkn"] Dec 02 20:29:31 crc kubenswrapper[4807]: I1202 20:29:31.207662 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-6dkkn"] Dec 02 20:29:32 crc kubenswrapper[4807]: I1202 20:29:32.987582 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b18fa4-ea02-46c3-86ea-10e33edde0c0" path="/var/lib/kubelet/pods/08b18fa4-ea02-46c3-86ea-10e33edde0c0/volumes" Dec 02 20:29:32 crc kubenswrapper[4807]: I1202 20:29:32.988475 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30197c83-6070-4ace-b56a-cf82f143ffb5" path="/var/lib/kubelet/pods/30197c83-6070-4ace-b56a-cf82f143ffb5/volumes" Dec 02 20:29:32 crc kubenswrapper[4807]: I1202 20:29:32.989363 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="390cf7e1-da04-4176-aa78-71446bc7cef4" path="/var/lib/kubelet/pods/390cf7e1-da04-4176-aa78-71446bc7cef4/volumes" Dec 02 20:29:32 crc kubenswrapper[4807]: I1202 20:29:32.990266 4807 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9bd526-4162-4ed4-8d18-f7dba71eec1f" path="/var/lib/kubelet/pods/3a9bd526-4162-4ed4-8d18-f7dba71eec1f/volumes" Dec 02 20:29:32 crc kubenswrapper[4807]: I1202 20:29:32.991767 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b4b20e6-d557-429a-b63f-8d51312805c9" path="/var/lib/kubelet/pods/5b4b20e6-d557-429a-b63f-8d51312805c9/volumes" Dec 02 20:29:32 crc kubenswrapper[4807]: I1202 20:29:32.992528 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b603338-b25a-435a-b8a8-32b3aa1791c8" path="/var/lib/kubelet/pods/6b603338-b25a-435a-b8a8-32b3aa1791c8/volumes" Dec 02 20:29:32 crc kubenswrapper[4807]: I1202 20:29:32.993487 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8927ec0-1565-476c-9625-462a7d198c4e" path="/var/lib/kubelet/pods/d8927ec0-1565-476c-9625-462a7d198c4e/volumes" Dec 02 20:29:35 crc kubenswrapper[4807]: I1202 20:29:35.973223 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:29:36 crc kubenswrapper[4807]: I1202 20:29:36.522825 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"083b747738a7b266cd838719e592adc712e0ead5027434ea8e6c8467bbbf14c4"} Dec 02 20:29:44 crc kubenswrapper[4807]: I1202 20:29:44.621530 4807 generic.go:334] "Generic (PLEG): container finished" podID="59148eea-351d-4d4c-ba60-e39e47372466" containerID="cc2464f467d1f2bbec0be74790fe223171013ae4b91e266b7cac2e792b46d740" exitCode=0 Dec 02 20:29:44 crc kubenswrapper[4807]: I1202 20:29:44.621638 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" 
event={"ID":"59148eea-351d-4d4c-ba60-e39e47372466","Type":"ContainerDied","Data":"cc2464f467d1f2bbec0be74790fe223171013ae4b91e266b7cac2e792b46d740"} Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.094178 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.242096 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-bootstrap-combined-ca-bundle\") pod \"59148eea-351d-4d4c-ba60-e39e47372466\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.242837 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-inventory\") pod \"59148eea-351d-4d4c-ba60-e39e47372466\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.242903 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-ssh-key\") pod \"59148eea-351d-4d4c-ba60-e39e47372466\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.243029 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpgms\" (UniqueName: \"kubernetes.io/projected/59148eea-351d-4d4c-ba60-e39e47372466-kube-api-access-lpgms\") pod \"59148eea-351d-4d4c-ba60-e39e47372466\" (UID: \"59148eea-351d-4d4c-ba60-e39e47372466\") " Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.250241 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/59148eea-351d-4d4c-ba60-e39e47372466-kube-api-access-lpgms" (OuterVolumeSpecName: "kube-api-access-lpgms") pod "59148eea-351d-4d4c-ba60-e39e47372466" (UID: "59148eea-351d-4d4c-ba60-e39e47372466"). InnerVolumeSpecName "kube-api-access-lpgms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.253018 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "59148eea-351d-4d4c-ba60-e39e47372466" (UID: "59148eea-351d-4d4c-ba60-e39e47372466"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.277845 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "59148eea-351d-4d4c-ba60-e39e47372466" (UID: "59148eea-351d-4d4c-ba60-e39e47372466"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.290215 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-inventory" (OuterVolumeSpecName: "inventory") pod "59148eea-351d-4d4c-ba60-e39e47372466" (UID: "59148eea-351d-4d4c-ba60-e39e47372466"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.345788 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.345847 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.345860 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpgms\" (UniqueName: \"kubernetes.io/projected/59148eea-351d-4d4c-ba60-e39e47372466-kube-api-access-lpgms\") on node \"crc\" DevicePath \"\"" Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.345890 4807 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59148eea-351d-4d4c-ba60-e39e47372466-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.674311 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28" event={"ID":"59148eea-351d-4d4c-ba60-e39e47372466","Type":"ContainerDied","Data":"e373b7f5d4ddc9ab18d4efc78d08733d7daa278d7465a6de1cba2606488cfb95"} Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.674378 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e373b7f5d4ddc9ab18d4efc78d08733d7daa278d7465a6de1cba2606488cfb95" Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.674437 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-74b28"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.756231 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"]
Dec 02 20:29:46 crc kubenswrapper[4807]: E1202 20:29:46.757083 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59148eea-351d-4d4c-ba60-e39e47372466" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.757191 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="59148eea-351d-4d4c-ba60-e39e47372466" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.757628 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="59148eea-351d-4d4c-ba60-e39e47372466" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.758695 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.762520 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.762846 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.763094 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.763300 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.766677 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"]
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.857894 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.858115 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.858549 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp87z\" (UniqueName: \"kubernetes.io/projected/3399e62b-d5c6-4469-9507-75e4e922201e-kube-api-access-bp87z\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.960676 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.960836 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.960937 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp87z\" (UniqueName: \"kubernetes.io/projected/3399e62b-d5c6-4469-9507-75e4e922201e-kube-api-access-bp87z\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.967360 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.967394 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:29:46 crc kubenswrapper[4807]: I1202 20:29:46.984433 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp87z\" (UniqueName: \"kubernetes.io/projected/3399e62b-d5c6-4469-9507-75e4e922201e-kube-api-access-bp87z\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:29:47 crc kubenswrapper[4807]: I1202 20:29:47.085005 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:29:47 crc kubenswrapper[4807]: I1202 20:29:47.873024 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"]
Dec 02 20:29:48 crc kubenswrapper[4807]: I1202 20:29:48.698854 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp" event={"ID":"3399e62b-d5c6-4469-9507-75e4e922201e","Type":"ContainerStarted","Data":"ca150041785f065dc9d1e79a34e8d6b462c4b085984dad6659fae24457a9c233"}
Dec 02 20:29:49 crc kubenswrapper[4807]: I1202 20:29:49.716919 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp" event={"ID":"3399e62b-d5c6-4469-9507-75e4e922201e","Type":"ContainerStarted","Data":"53d355a17a2adc1a2bf70cf99bf991f9da87a0189cb9f61485b3d6875c1100b8"}
Dec 02 20:29:49 crc kubenswrapper[4807]: I1202 20:29:49.757361 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp" podStartSLOduration=3.158635849 podStartE2EDuration="3.757332726s" podCreationTimestamp="2025-12-02 20:29:46 +0000 UTC" firstStartedPulling="2025-12-02 20:29:47.876744503 +0000 UTC m=+1923.177652008" lastFinishedPulling="2025-12-02 20:29:48.47544139 +0000 UTC m=+1923.776348885" observedRunningTime="2025-12-02 20:29:49.739757306 +0000 UTC m=+1925.040664811" watchObservedRunningTime="2025-12-02 20:29:49.757332726 +0000 UTC m=+1925.058240221"
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.048661 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qr6rh"]
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.063643 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-95a6-account-create-update-5dpp4"]
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.075791 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f2a8-account-create-update-7v4b4"]
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.086832 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8r86t"]
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.096898 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6097-account-create-update-jg6j9"]
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.106425 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wmhlb"]
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.114738 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-95a6-account-create-update-5dpp4"]
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.122455 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f2a8-account-create-update-7v4b4"]
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.129783 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8r86t"]
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.137983 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wmhlb"]
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.145661 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qr6rh"]
Dec 02 20:29:53 crc kubenswrapper[4807]: I1202 20:29:53.154322 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6097-account-create-update-jg6j9"]
Dec 02 20:29:54 crc kubenswrapper[4807]: I1202 20:29:54.987443 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1141d1f9-6d7f-46fa-8e85-511b8a0adddf" path="/var/lib/kubelet/pods/1141d1f9-6d7f-46fa-8e85-511b8a0adddf/volumes"
Dec 02 20:29:54 crc kubenswrapper[4807]: I1202 20:29:54.991424 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="323b935c-8f72-4514-96c0-6edea05c6498" path="/var/lib/kubelet/pods/323b935c-8f72-4514-96c0-6edea05c6498/volumes"
Dec 02 20:29:54 crc kubenswrapper[4807]: I1202 20:29:54.992339 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f1e4f9-4739-44a2-9257-389f73e77dfa" path="/var/lib/kubelet/pods/38f1e4f9-4739-44a2-9257-389f73e77dfa/volumes"
Dec 02 20:29:54 crc kubenswrapper[4807]: I1202 20:29:54.993104 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa51800c-1f7b-4000-8547-c391f3a6dc6a" path="/var/lib/kubelet/pods/aa51800c-1f7b-4000-8547-c391f3a6dc6a/volumes"
Dec 02 20:29:54 crc kubenswrapper[4807]: I1202 20:29:54.993806 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b0a41b-3334-4181-bad4-fd5cf030b010" path="/var/lib/kubelet/pods/e6b0a41b-3334-4181-bad4-fd5cf030b010/volumes"
Dec 02 20:29:54 crc kubenswrapper[4807]: I1202 20:29:54.995206 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b198e4-d58c-4c1e-b6eb-35ecdeb30773" path="/var/lib/kubelet/pods/f9b198e4-d58c-4c1e-b6eb-35ecdeb30773/volumes"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.151965 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"]
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.154669 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.157202 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.164307 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"]
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.167710 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.208679 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/769c589c-2259-4bbd-8766-92681831eccb-secret-volume\") pod \"collect-profiles-29411790-cxdwh\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.208815 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/769c589c-2259-4bbd-8766-92681831eccb-config-volume\") pod \"collect-profiles-29411790-cxdwh\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.208900 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-742c5\" (UniqueName: \"kubernetes.io/projected/769c589c-2259-4bbd-8766-92681831eccb-kube-api-access-742c5\") pod \"collect-profiles-29411790-cxdwh\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.310658 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/769c589c-2259-4bbd-8766-92681831eccb-secret-volume\") pod \"collect-profiles-29411790-cxdwh\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.311111 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/769c589c-2259-4bbd-8766-92681831eccb-config-volume\") pod \"collect-profiles-29411790-cxdwh\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.311396 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-742c5\" (UniqueName: \"kubernetes.io/projected/769c589c-2259-4bbd-8766-92681831eccb-kube-api-access-742c5\") pod \"collect-profiles-29411790-cxdwh\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.312284 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/769c589c-2259-4bbd-8766-92681831eccb-config-volume\") pod \"collect-profiles-29411790-cxdwh\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.319515 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/769c589c-2259-4bbd-8766-92681831eccb-secret-volume\") pod \"collect-profiles-29411790-cxdwh\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.345623 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-742c5\" (UniqueName: \"kubernetes.io/projected/769c589c-2259-4bbd-8766-92681831eccb-kube-api-access-742c5\") pod \"collect-profiles-29411790-cxdwh\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.483072 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:00 crc kubenswrapper[4807]: I1202 20:30:00.948370 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"]
Dec 02 20:30:01 crc kubenswrapper[4807]: I1202 20:30:01.885831 4807 generic.go:334] "Generic (PLEG): container finished" podID="769c589c-2259-4bbd-8766-92681831eccb" containerID="739669dc3950619e2fe474531d7008051b6b1eb09596d83a1091dc2cbefdb502" exitCode=0
Dec 02 20:30:01 crc kubenswrapper[4807]: I1202 20:30:01.887425 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh" event={"ID":"769c589c-2259-4bbd-8766-92681831eccb","Type":"ContainerDied","Data":"739669dc3950619e2fe474531d7008051b6b1eb09596d83a1091dc2cbefdb502"}
Dec 02 20:30:01 crc kubenswrapper[4807]: I1202 20:30:01.887561 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh" event={"ID":"769c589c-2259-4bbd-8766-92681831eccb","Type":"ContainerStarted","Data":"6b4db4c7b57dea3dfed145f829363384668016876ad58963e8bcf3bcc27b838a"}
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.261183 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.389970 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/769c589c-2259-4bbd-8766-92681831eccb-config-volume\") pod \"769c589c-2259-4bbd-8766-92681831eccb\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") "
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.390066 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-742c5\" (UniqueName: \"kubernetes.io/projected/769c589c-2259-4bbd-8766-92681831eccb-kube-api-access-742c5\") pod \"769c589c-2259-4bbd-8766-92681831eccb\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") "
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.390105 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/769c589c-2259-4bbd-8766-92681831eccb-secret-volume\") pod \"769c589c-2259-4bbd-8766-92681831eccb\" (UID: \"769c589c-2259-4bbd-8766-92681831eccb\") "
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.390912 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/769c589c-2259-4bbd-8766-92681831eccb-config-volume" (OuterVolumeSpecName: "config-volume") pod "769c589c-2259-4bbd-8766-92681831eccb" (UID: "769c589c-2259-4bbd-8766-92681831eccb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.401961 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769c589c-2259-4bbd-8766-92681831eccb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "769c589c-2259-4bbd-8766-92681831eccb" (UID: "769c589c-2259-4bbd-8766-92681831eccb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.402825 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769c589c-2259-4bbd-8766-92681831eccb-kube-api-access-742c5" (OuterVolumeSpecName: "kube-api-access-742c5") pod "769c589c-2259-4bbd-8766-92681831eccb" (UID: "769c589c-2259-4bbd-8766-92681831eccb"). InnerVolumeSpecName "kube-api-access-742c5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.493172 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-742c5\" (UniqueName: \"kubernetes.io/projected/769c589c-2259-4bbd-8766-92681831eccb-kube-api-access-742c5\") on node \"crc\" DevicePath \"\""
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.493213 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/769c589c-2259-4bbd-8766-92681831eccb-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.493223 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/769c589c-2259-4bbd-8766-92681831eccb-config-volume\") on node \"crc\" DevicePath \"\""
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.910235 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh" event={"ID":"769c589c-2259-4bbd-8766-92681831eccb","Type":"ContainerDied","Data":"6b4db4c7b57dea3dfed145f829363384668016876ad58963e8bcf3bcc27b838a"}
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.910700 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b4db4c7b57dea3dfed145f829363384668016876ad58963e8bcf3bcc27b838a"
Dec 02 20:30:03 crc kubenswrapper[4807]: I1202 20:30:03.910297 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.142366 4807 scope.go:117] "RemoveContainer" containerID="86e6601427412c7c3af59e9e2879f74c1dc371d5461e4542966ab7f4b570ad15"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.177878 4807 scope.go:117] "RemoveContainer" containerID="84a679e22fe45118cdd97bc864b3daf34e2764b9ce124d28838391ed9d8b13ef"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.202898 4807 scope.go:117] "RemoveContainer" containerID="f653b180249a9a4e8bde1e96a28376e953a014feaa1d94c86eafbf91ac6c9f9b"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.256592 4807 scope.go:117] "RemoveContainer" containerID="d2557b6d0d009c3c22817ec1e240888b2382e6beb35d83125eb7aad25b7daa5e"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.282323 4807 scope.go:117] "RemoveContainer" containerID="0c17d948283b8bb8e0d783d58af0728007697c56538bac87a741940c6a06d0b2"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.392872 4807 scope.go:117] "RemoveContainer" containerID="e6f312279c514e140ef23636de13a4ac7c3610fd4b9f4b42203abff2be8720fa"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.448080 4807 scope.go:117] "RemoveContainer" containerID="11ad1232df83790d17009b657198dcfdbe9df3d1e34c183a2902338b24e2cdfd"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.504126 4807 scope.go:117] "RemoveContainer" containerID="8845d76c1471ee4fb8c57841130d72656f50db5f461f581032375ef4ff478128"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.552508 4807 scope.go:117] "RemoveContainer" containerID="e9630964cbde1ca10c955054abcc52a293ff06f9e399ab1398e81fa4c7d7c762"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.583336 4807 scope.go:117] "RemoveContainer" containerID="5df04e2fedee99698edbf2af49a68c5e8ed7ed34631fbe29a1cd9a7a62093528"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.604661 4807 scope.go:117] "RemoveContainer" containerID="e1ffc619cf349acbec0f5074bf5d5d110a88db30412c2e9259695fbd0b9e8a82"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.690024 4807 scope.go:117] "RemoveContainer" containerID="09b26364b38c93c98ab4dd264eac9bd59c3d1343ee873cda958c5e64c52890aa"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.752220 4807 scope.go:117] "RemoveContainer" containerID="651f490f87cfb537cb2f9e57d84f6548ef70d4778258b10a7e7998c5b7852171"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.811283 4807 scope.go:117] "RemoveContainer" containerID="2abdfb1d93c9081581c4ffc5d4699330cf0a2655b01fc364421929d2ead46544"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.857120 4807 scope.go:117] "RemoveContainer" containerID="9e84cbe403435241cff518569ceed84160782277981e5cccf2945cf119e89901"
Dec 02 20:30:09 crc kubenswrapper[4807]: I1202 20:30:09.876348 4807 scope.go:117] "RemoveContainer" containerID="bbe035a07e060203525e8c15282e42ccc7c1178199d1ef52e3799ef95042bc15"
Dec 02 20:30:10 crc kubenswrapper[4807]: I1202 20:30:10.059163 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-trkqq"]
Dec 02 20:30:10 crc kubenswrapper[4807]: I1202 20:30:10.072423 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-trkqq"]
Dec 02 20:30:10 crc kubenswrapper[4807]: I1202 20:30:10.985559 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53ba8115-bc84-45ea-804d-c29add38ee3a" path="/var/lib/kubelet/pods/53ba8115-bc84-45ea-804d-c29add38ee3a/volumes"
Dec 02 20:30:20 crc kubenswrapper[4807]: I1202 20:30:20.146127 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-9qc25"]
Dec 02 20:30:20 crc kubenswrapper[4807]: I1202 20:30:20.166800 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-9qc25"]
Dec 02 20:30:20 crc kubenswrapper[4807]: I1202 20:30:20.988242 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="103c33fd-ac95-4efe-b1c2-b6c9187eb613" path="/var/lib/kubelet/pods/103c33fd-ac95-4efe-b1c2-b6c9187eb613/volumes"
Dec 02 20:30:46 crc kubenswrapper[4807]: I1202 20:30:46.059771 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-vsnlc"]
Dec 02 20:30:46 crc kubenswrapper[4807]: I1202 20:30:46.069680 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-vsnlc"]
Dec 02 20:30:46 crc kubenswrapper[4807]: I1202 20:30:46.985496 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e35946b-d565-4354-9c86-8eb06b4ed154" path="/var/lib/kubelet/pods/5e35946b-d565-4354-9c86-8eb06b4ed154/volumes"
Dec 02 20:30:57 crc kubenswrapper[4807]: I1202 20:30:57.043503 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-rp8rr"]
Dec 02 20:30:57 crc kubenswrapper[4807]: I1202 20:30:57.057318 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-rp8rr"]
Dec 02 20:30:58 crc kubenswrapper[4807]: I1202 20:30:58.995894 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="372e15f7-0eed-4b38-b4a9-19d3781e6e89" path="/var/lib/kubelet/pods/372e15f7-0eed-4b38-b4a9-19d3781e6e89/volumes"
Dec 02 20:31:09 crc kubenswrapper[4807]: I1202 20:31:09.054249 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-tgqtm"]
Dec 02 20:31:09 crc kubenswrapper[4807]: I1202 20:31:09.064662 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vm4lf"]
Dec 02 20:31:09 crc kubenswrapper[4807]: I1202 20:31:09.076609 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-tgqtm"]
Dec 02 20:31:09 crc kubenswrapper[4807]: I1202 20:31:09.085964 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vm4lf"]
Dec 02 20:31:10 crc kubenswrapper[4807]: I1202 20:31:10.221362 4807 scope.go:117] "RemoveContainer" containerID="89f4e947d7dd0ac185fabb253543b0ee6fc90bf5bf9fc4172e826a69c4f7f044"
Dec 02 20:31:10 crc kubenswrapper[4807]: I1202 20:31:10.267954 4807 scope.go:117] "RemoveContainer" containerID="c6f655e62bc881fdb84a50db51f44c7de530390cd4b3965f1d2c5e41583e4ac4"
Dec 02 20:31:10 crc kubenswrapper[4807]: I1202 20:31:10.361437 4807 scope.go:117] "RemoveContainer" containerID="f9e09eea3225b6ab9506ff5d6c1236d0d7389d54ac92d17cffdc71838b7d48d0"
Dec 02 20:31:10 crc kubenswrapper[4807]: I1202 20:31:10.431374 4807 scope.go:117] "RemoveContainer" containerID="57c32b143dbf76bcc9aceaabf44ea7e4c6f20502a7a47d4ec95ecb1f7e2e7dbe"
Dec 02 20:31:10 crc kubenswrapper[4807]: I1202 20:31:10.988371 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7183d5bc-237d-4fd9-8d9c-31ccc6c46afe" path="/var/lib/kubelet/pods/7183d5bc-237d-4fd9-8d9c-31ccc6c46afe/volumes"
Dec 02 20:31:10 crc kubenswrapper[4807]: I1202 20:31:10.989173 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af9ee27-367c-4051-95bd-78ede0827b19" path="/var/lib/kubelet/pods/9af9ee27-367c-4051-95bd-78ede0827b19/volumes"
Dec 02 20:31:20 crc kubenswrapper[4807]: I1202 20:31:20.043807 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-bppsk"]
Dec 02 20:31:20 crc kubenswrapper[4807]: I1202 20:31:20.059633 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-bppsk"]
Dec 02 20:31:20 crc kubenswrapper[4807]: I1202 20:31:20.991794 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b83e24a-ce7d-42b1-998f-1ede619914ff" path="/var/lib/kubelet/pods/9b83e24a-ce7d-42b1-998f-1ede619914ff/volumes"
Dec 02 20:31:37 crc kubenswrapper[4807]: I1202 20:31:37.180657 4807 generic.go:334] "Generic (PLEG): container finished" podID="3399e62b-d5c6-4469-9507-75e4e922201e" containerID="53d355a17a2adc1a2bf70cf99bf991f9da87a0189cb9f61485b3d6875c1100b8" exitCode=0
Dec 02 20:31:37 crc kubenswrapper[4807]: I1202 20:31:37.181463 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp" event={"ID":"3399e62b-d5c6-4469-9507-75e4e922201e","Type":"ContainerDied","Data":"53d355a17a2adc1a2bf70cf99bf991f9da87a0189cb9f61485b3d6875c1100b8"}
Dec 02 20:31:38 crc kubenswrapper[4807]: I1202 20:31:38.649850 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:31:38 crc kubenswrapper[4807]: I1202 20:31:38.661988 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-inventory\") pod \"3399e62b-d5c6-4469-9507-75e4e922201e\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") "
Dec 02 20:31:38 crc kubenswrapper[4807]: I1202 20:31:38.662376 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-ssh-key\") pod \"3399e62b-d5c6-4469-9507-75e4e922201e\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") "
Dec 02 20:31:38 crc kubenswrapper[4807]: I1202 20:31:38.662548 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp87z\" (UniqueName: \"kubernetes.io/projected/3399e62b-d5c6-4469-9507-75e4e922201e-kube-api-access-bp87z\") pod \"3399e62b-d5c6-4469-9507-75e4e922201e\" (UID: \"3399e62b-d5c6-4469-9507-75e4e922201e\") "
Dec 02 20:31:38 crc kubenswrapper[4807]: I1202 20:31:38.674112 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3399e62b-d5c6-4469-9507-75e4e922201e-kube-api-access-bp87z" (OuterVolumeSpecName: "kube-api-access-bp87z") pod "3399e62b-d5c6-4469-9507-75e4e922201e" (UID: "3399e62b-d5c6-4469-9507-75e4e922201e"). InnerVolumeSpecName "kube-api-access-bp87z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:31:38 crc kubenswrapper[4807]: I1202 20:31:38.705551 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3399e62b-d5c6-4469-9507-75e4e922201e" (UID: "3399e62b-d5c6-4469-9507-75e4e922201e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:31:38 crc kubenswrapper[4807]: I1202 20:31:38.712798 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-inventory" (OuterVolumeSpecName: "inventory") pod "3399e62b-d5c6-4469-9507-75e4e922201e" (UID: "3399e62b-d5c6-4469-9507-75e4e922201e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:31:38 crc kubenswrapper[4807]: I1202 20:31:38.765765 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp87z\" (UniqueName: \"kubernetes.io/projected/3399e62b-d5c6-4469-9507-75e4e922201e-kube-api-access-bp87z\") on node \"crc\" DevicePath \"\""
Dec 02 20:31:38 crc kubenswrapper[4807]: I1202 20:31:38.765799 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 20:31:38 crc kubenswrapper[4807]: I1202 20:31:38.765809 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3399e62b-d5c6-4469-9507-75e4e922201e-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.200826 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp" event={"ID":"3399e62b-d5c6-4469-9507-75e4e922201e","Type":"ContainerDied","Data":"ca150041785f065dc9d1e79a34e8d6b462c4b085984dad6659fae24457a9c233"}
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.201279 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca150041785f065dc9d1e79a34e8d6b462c4b085984dad6659fae24457a9c233"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.200962 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.329570 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm"]
Dec 02 20:31:39 crc kubenswrapper[4807]: E1202 20:31:39.330503 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769c589c-2259-4bbd-8766-92681831eccb" containerName="collect-profiles"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.330538 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="769c589c-2259-4bbd-8766-92681831eccb" containerName="collect-profiles"
Dec 02 20:31:39 crc kubenswrapper[4807]: E1202 20:31:39.330599 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3399e62b-d5c6-4469-9507-75e4e922201e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.330613 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3399e62b-d5c6-4469-9507-75e4e922201e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.331162 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="3399e62b-d5c6-4469-9507-75e4e922201e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.331213 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="769c589c-2259-4bbd-8766-92681831eccb" containerName="collect-profiles"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.332538 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.338976 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm"]
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.340799 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.340815 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.340839 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.341535 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.481134 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.481393 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.481929 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l9k2\" (UniqueName: \"kubernetes.io/projected/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-kube-api-access-5l9k2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.585170 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l9k2\" (UniqueName: \"kubernetes.io/projected/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-kube-api-access-5l9k2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.585351 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.586202 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm"
Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.589575 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm" Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.591338 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm" Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.607355 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l9k2\" (UniqueName: \"kubernetes.io/projected/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-kube-api-access-5l9k2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm" Dec 02 20:31:39 crc kubenswrapper[4807]: I1202 20:31:39.670414 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm" Dec 02 20:31:40 crc kubenswrapper[4807]: I1202 20:31:40.251126 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm"] Dec 02 20:31:40 crc kubenswrapper[4807]: I1202 20:31:40.261656 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 20:31:41 crc kubenswrapper[4807]: I1202 20:31:41.058046 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-zlf57"] Dec 02 20:31:41 crc kubenswrapper[4807]: I1202 20:31:41.075169 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-zlf57"] Dec 02 20:31:41 crc kubenswrapper[4807]: I1202 20:31:41.234759 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm" event={"ID":"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf","Type":"ContainerStarted","Data":"30731a555362898495bb50571cf77167d228125956c9fe86d1a08c8c70463266"} Dec 02 20:31:41 crc kubenswrapper[4807]: I1202 20:31:41.234814 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm" event={"ID":"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf","Type":"ContainerStarted","Data":"0f4725ad695f8202737a7dee6e23afc3eb20db4adb5b5154ab88e368525a9432"} Dec 02 20:31:41 crc kubenswrapper[4807]: I1202 20:31:41.259315 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm" podStartSLOduration=1.616551249 podStartE2EDuration="2.259287434s" podCreationTimestamp="2025-12-02 20:31:39 +0000 UTC" firstStartedPulling="2025-12-02 20:31:40.261343899 +0000 UTC m=+2035.562251404" lastFinishedPulling="2025-12-02 20:31:40.904080094 +0000 UTC m=+2036.204987589" observedRunningTime="2025-12-02 
20:31:41.25086619 +0000 UTC m=+2036.551773705" watchObservedRunningTime="2025-12-02 20:31:41.259287434 +0000 UTC m=+2036.560194929" Dec 02 20:31:42 crc kubenswrapper[4807]: I1202 20:31:42.983962 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b4b9175-26ae-4cff-8dd2-7682b1408271" path="/var/lib/kubelet/pods/2b4b9175-26ae-4cff-8dd2-7682b1408271/volumes" Dec 02 20:31:58 crc kubenswrapper[4807]: I1202 20:31:58.293183 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:31:58 crc kubenswrapper[4807]: I1202 20:31:58.294593 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:32:10 crc kubenswrapper[4807]: I1202 20:32:10.619787 4807 scope.go:117] "RemoveContainer" containerID="7c3bdef80b4682d971604e3dd3033e145a226709e1bb76a18169004f91d6b88e" Dec 02 20:32:10 crc kubenswrapper[4807]: I1202 20:32:10.664203 4807 scope.go:117] "RemoveContainer" containerID="ebf674255993b10f848b561bd58bf084a292128e378892fafe3b9572e0f922c8" Dec 02 20:32:10 crc kubenswrapper[4807]: I1202 20:32:10.740576 4807 scope.go:117] "RemoveContainer" containerID="1c15e13888a808881cccbd1943409e7a45ae2319eb7b3fd3c184ddcad2f2a202" Dec 02 20:32:10 crc kubenswrapper[4807]: I1202 20:32:10.777027 4807 scope.go:117] "RemoveContainer" containerID="056b97430d9365f074a446ca366ddf09e067626066666459ccce07d819f95502" Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.081769 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-4f38-account-create-update-wbhvj"] Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.101938 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-fb4f-account-create-update-rnlg5"] Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.115248 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-ljt26"] Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.124279 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-fch86"] Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.134696 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-fe50-account-create-update-rpfxl"] Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.142475 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-fb4f-account-create-update-rnlg5"] Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.150204 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-fch86"] Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.157777 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-fe50-account-create-update-rpfxl"] Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.165325 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4f38-account-create-update-wbhvj"] Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.172878 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-ljt26"] Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.180356 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-s4gh8"] Dec 02 20:32:27 crc kubenswrapper[4807]: I1202 20:32:27.186886 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-s4gh8"] Dec 02 20:32:28 crc kubenswrapper[4807]: I1202 20:32:28.293836 4807 
patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:32:28 crc kubenswrapper[4807]: I1202 20:32:28.293929 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:32:28 crc kubenswrapper[4807]: I1202 20:32:28.987556 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ab6b01-20b7-4320-aa91-2deecbdac66a" path="/var/lib/kubelet/pods/04ab6b01-20b7-4320-aa91-2deecbdac66a/volumes" Dec 02 20:32:28 crc kubenswrapper[4807]: I1202 20:32:28.989690 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54376bb6-4987-4ef6-8814-51fa5f95e7bb" path="/var/lib/kubelet/pods/54376bb6-4987-4ef6-8814-51fa5f95e7bb/volumes" Dec 02 20:32:28 crc kubenswrapper[4807]: I1202 20:32:28.991423 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bde51b6-13ef-4b4b-b126-b3effc7dc8ea" path="/var/lib/kubelet/pods/7bde51b6-13ef-4b4b-b126-b3effc7dc8ea/volumes" Dec 02 20:32:28 crc kubenswrapper[4807]: I1202 20:32:28.993676 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c1e20d0-372b-45be-9e7a-eed8a5264b08" path="/var/lib/kubelet/pods/7c1e20d0-372b-45be-9e7a-eed8a5264b08/volumes" Dec 02 20:32:28 crc kubenswrapper[4807]: I1202 20:32:28.996631 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac62b3e0-8a0f-4953-b543-4f699b3bb7cb" path="/var/lib/kubelet/pods/ac62b3e0-8a0f-4953-b543-4f699b3bb7cb/volumes" Dec 02 20:32:28 crc kubenswrapper[4807]: I1202 20:32:28.997902 4807 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0864d50-612d-47a3-bc0f-59883303dedf" path="/var/lib/kubelet/pods/d0864d50-612d-47a3-bc0f-59883303dedf/volumes" Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.088872 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-567g6"] Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.092753 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.106639 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-567g6"] Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.195144 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bzcm\" (UniqueName: \"kubernetes.io/projected/276d433e-e37e-494a-9d72-12e9a107bafc-kube-api-access-2bzcm\") pod \"certified-operators-567g6\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.195236 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-catalog-content\") pod \"certified-operators-567g6\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.195345 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-utilities\") pod \"certified-operators-567g6\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:45 crc 
kubenswrapper[4807]: I1202 20:32:45.297837 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bzcm\" (UniqueName: \"kubernetes.io/projected/276d433e-e37e-494a-9d72-12e9a107bafc-kube-api-access-2bzcm\") pod \"certified-operators-567g6\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.297948 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-catalog-content\") pod \"certified-operators-567g6\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.298064 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-utilities\") pod \"certified-operators-567g6\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.298478 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-catalog-content\") pod \"certified-operators-567g6\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.298538 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-utilities\") pod \"certified-operators-567g6\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.318843 
4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bzcm\" (UniqueName: \"kubernetes.io/projected/276d433e-e37e-494a-9d72-12e9a107bafc-kube-api-access-2bzcm\") pod \"certified-operators-567g6\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.438040 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:45 crc kubenswrapper[4807]: I1202 20:32:45.967808 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-567g6"] Dec 02 20:32:46 crc kubenswrapper[4807]: I1202 20:32:46.982146 4807 generic.go:334] "Generic (PLEG): container finished" podID="276d433e-e37e-494a-9d72-12e9a107bafc" containerID="913cdac7f8c38747186b8971b6ea16e2ce8a3ef81b2bab69468bcb2ab6d909b6" exitCode=0 Dec 02 20:32:47 crc kubenswrapper[4807]: I1202 20:32:47.009607 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567g6" event={"ID":"276d433e-e37e-494a-9d72-12e9a107bafc","Type":"ContainerDied","Data":"913cdac7f8c38747186b8971b6ea16e2ce8a3ef81b2bab69468bcb2ab6d909b6"} Dec 02 20:32:47 crc kubenswrapper[4807]: I1202 20:32:47.009681 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567g6" event={"ID":"276d433e-e37e-494a-9d72-12e9a107bafc","Type":"ContainerStarted","Data":"cb72387b68c74d307338cb1c8e07fbb74335b9ba9f11454a940d9fc9608a2d68"} Dec 02 20:32:47 crc kubenswrapper[4807]: I1202 20:32:47.998088 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567g6" event={"ID":"276d433e-e37e-494a-9d72-12e9a107bafc","Type":"ContainerStarted","Data":"728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f"} Dec 02 20:32:49 crc kubenswrapper[4807]: I1202 20:32:49.017593 4807 
generic.go:334] "Generic (PLEG): container finished" podID="276d433e-e37e-494a-9d72-12e9a107bafc" containerID="728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f" exitCode=0 Dec 02 20:32:49 crc kubenswrapper[4807]: I1202 20:32:49.017676 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567g6" event={"ID":"276d433e-e37e-494a-9d72-12e9a107bafc","Type":"ContainerDied","Data":"728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f"} Dec 02 20:32:50 crc kubenswrapper[4807]: I1202 20:32:50.029869 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567g6" event={"ID":"276d433e-e37e-494a-9d72-12e9a107bafc","Type":"ContainerStarted","Data":"360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709"} Dec 02 20:32:50 crc kubenswrapper[4807]: I1202 20:32:50.051712 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-567g6" podStartSLOduration=2.5617399020000002 podStartE2EDuration="5.051687243s" podCreationTimestamp="2025-12-02 20:32:45 +0000 UTC" firstStartedPulling="2025-12-02 20:32:46.98804041 +0000 UTC m=+2102.288947915" lastFinishedPulling="2025-12-02 20:32:49.477987731 +0000 UTC m=+2104.778895256" observedRunningTime="2025-12-02 20:32:50.049374836 +0000 UTC m=+2105.350282421" watchObservedRunningTime="2025-12-02 20:32:50.051687243 +0000 UTC m=+2105.352594738" Dec 02 20:32:55 crc kubenswrapper[4807]: I1202 20:32:55.439394 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:55 crc kubenswrapper[4807]: I1202 20:32:55.440393 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:55 crc kubenswrapper[4807]: I1202 20:32:55.532902 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:56 crc kubenswrapper[4807]: I1202 20:32:56.164908 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:56 crc kubenswrapper[4807]: I1202 20:32:56.224344 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-567g6"] Dec 02 20:32:57 crc kubenswrapper[4807]: I1202 20:32:57.074375 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hrk4p"] Dec 02 20:32:57 crc kubenswrapper[4807]: I1202 20:32:57.086327 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hrk4p"] Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.136829 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-567g6" podUID="276d433e-e37e-494a-9d72-12e9a107bafc" containerName="registry-server" containerID="cri-o://360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709" gracePeriod=2 Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.293641 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.294065 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.294134 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.295355 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"083b747738a7b266cd838719e592adc712e0ead5027434ea8e6c8467bbbf14c4"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.295448 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://083b747738a7b266cd838719e592adc712e0ead5027434ea8e6c8467bbbf14c4" gracePeriod=600 Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.645440 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.787515 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-catalog-content\") pod \"276d433e-e37e-494a-9d72-12e9a107bafc\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.787780 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bzcm\" (UniqueName: \"kubernetes.io/projected/276d433e-e37e-494a-9d72-12e9a107bafc-kube-api-access-2bzcm\") pod \"276d433e-e37e-494a-9d72-12e9a107bafc\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.787851 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-utilities\") pod \"276d433e-e37e-494a-9d72-12e9a107bafc\" (UID: \"276d433e-e37e-494a-9d72-12e9a107bafc\") " Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.788678 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-utilities" (OuterVolumeSpecName: "utilities") pod "276d433e-e37e-494a-9d72-12e9a107bafc" (UID: "276d433e-e37e-494a-9d72-12e9a107bafc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.795548 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/276d433e-e37e-494a-9d72-12e9a107bafc-kube-api-access-2bzcm" (OuterVolumeSpecName: "kube-api-access-2bzcm") pod "276d433e-e37e-494a-9d72-12e9a107bafc" (UID: "276d433e-e37e-494a-9d72-12e9a107bafc"). InnerVolumeSpecName "kube-api-access-2bzcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.844508 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "276d433e-e37e-494a-9d72-12e9a107bafc" (UID: "276d433e-e37e-494a-9d72-12e9a107bafc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.890744 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bzcm\" (UniqueName: \"kubernetes.io/projected/276d433e-e37e-494a-9d72-12e9a107bafc-kube-api-access-2bzcm\") on node \"crc\" DevicePath \"\"" Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.890776 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.890785 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276d433e-e37e-494a-9d72-12e9a107bafc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:32:58 crc kubenswrapper[4807]: I1202 20:32:58.983531 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d53985-4c04-4e9c-be07-b866ac014640" path="/var/lib/kubelet/pods/76d53985-4c04-4e9c-be07-b866ac014640/volumes" Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.159134 4807 generic.go:334] "Generic (PLEG): container finished" podID="276d433e-e37e-494a-9d72-12e9a107bafc" containerID="360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709" exitCode=0 Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.159218 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567g6" 
event={"ID":"276d433e-e37e-494a-9d72-12e9a107bafc","Type":"ContainerDied","Data":"360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709"} Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.159252 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567g6" event={"ID":"276d433e-e37e-494a-9d72-12e9a107bafc","Type":"ContainerDied","Data":"cb72387b68c74d307338cb1c8e07fbb74335b9ba9f11454a940d9fc9608a2d68"} Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.159275 4807 scope.go:117] "RemoveContainer" containerID="360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709" Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.159431 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-567g6" Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.165028 4807 generic.go:334] "Generic (PLEG): container finished" podID="5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf" containerID="30731a555362898495bb50571cf77167d228125956c9fe86d1a08c8c70463266" exitCode=0 Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.165111 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm" event={"ID":"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf","Type":"ContainerDied","Data":"30731a555362898495bb50571cf77167d228125956c9fe86d1a08c8c70463266"} Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.171855 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="083b747738a7b266cd838719e592adc712e0ead5027434ea8e6c8467bbbf14c4" exitCode=0 Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.171949 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" 
event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"083b747738a7b266cd838719e592adc712e0ead5027434ea8e6c8467bbbf14c4"} Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.171994 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4"} Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.192998 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-567g6"] Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.200517 4807 scope.go:117] "RemoveContainer" containerID="728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f" Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.203091 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-567g6"] Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.225239 4807 scope.go:117] "RemoveContainer" containerID="913cdac7f8c38747186b8971b6ea16e2ce8a3ef81b2bab69468bcb2ab6d909b6" Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.246340 4807 scope.go:117] "RemoveContainer" containerID="360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709" Dec 02 20:32:59 crc kubenswrapper[4807]: E1202 20:32:59.246849 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709\": container with ID starting with 360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709 not found: ID does not exist" containerID="360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709" Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.246933 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709"} err="failed to get container status \"360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709\": rpc error: code = NotFound desc = could not find container \"360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709\": container with ID starting with 360ff7966e4aeb8f60510977158925888643f4ea34ac10ae348903ba9a319709 not found: ID does not exist" Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.247001 4807 scope.go:117] "RemoveContainer" containerID="728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f" Dec 02 20:32:59 crc kubenswrapper[4807]: E1202 20:32:59.248853 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f\": container with ID starting with 728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f not found: ID does not exist" containerID="728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f" Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.248885 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f"} err="failed to get container status \"728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f\": rpc error: code = NotFound desc = could not find container \"728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f\": container with ID starting with 728ae1a3a650c10c076bbae6c06e04d4076c527ca46772610b7d7d3de5b4c73f not found: ID does not exist" Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.248904 4807 scope.go:117] "RemoveContainer" containerID="913cdac7f8c38747186b8971b6ea16e2ce8a3ef81b2bab69468bcb2ab6d909b6" Dec 02 20:32:59 crc kubenswrapper[4807]: E1202 20:32:59.249560 4807 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"913cdac7f8c38747186b8971b6ea16e2ce8a3ef81b2bab69468bcb2ab6d909b6\": container with ID starting with 913cdac7f8c38747186b8971b6ea16e2ce8a3ef81b2bab69468bcb2ab6d909b6 not found: ID does not exist" containerID="913cdac7f8c38747186b8971b6ea16e2ce8a3ef81b2bab69468bcb2ab6d909b6" Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.249596 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913cdac7f8c38747186b8971b6ea16e2ce8a3ef81b2bab69468bcb2ab6d909b6"} err="failed to get container status \"913cdac7f8c38747186b8971b6ea16e2ce8a3ef81b2bab69468bcb2ab6d909b6\": rpc error: code = NotFound desc = could not find container \"913cdac7f8c38747186b8971b6ea16e2ce8a3ef81b2bab69468bcb2ab6d909b6\": container with ID starting with 913cdac7f8c38747186b8971b6ea16e2ce8a3ef81b2bab69468bcb2ab6d909b6 not found: ID does not exist" Dec 02 20:32:59 crc kubenswrapper[4807]: I1202 20:32:59.249614 4807 scope.go:117] "RemoveContainer" containerID="5eea317392f5da9beed6a862822e1566ce1c0ebe45aed8675fd2378ec6c38129" Dec 02 20:33:00 crc kubenswrapper[4807]: I1202 20:33:00.669885 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm" Dec 02 20:33:00 crc kubenswrapper[4807]: I1202 20:33:00.744911 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l9k2\" (UniqueName: \"kubernetes.io/projected/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-kube-api-access-5l9k2\") pod \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " Dec 02 20:33:00 crc kubenswrapper[4807]: I1202 20:33:00.745020 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-inventory\") pod \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " Dec 02 20:33:00 crc kubenswrapper[4807]: I1202 20:33:00.745282 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-ssh-key\") pod \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\" (UID: \"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf\") " Dec 02 20:33:00 crc kubenswrapper[4807]: I1202 20:33:00.752260 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-kube-api-access-5l9k2" (OuterVolumeSpecName: "kube-api-access-5l9k2") pod "5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf" (UID: "5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf"). InnerVolumeSpecName "kube-api-access-5l9k2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:33:00 crc kubenswrapper[4807]: I1202 20:33:00.775622 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-inventory" (OuterVolumeSpecName: "inventory") pod "5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf" (UID: "5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:33:00 crc kubenswrapper[4807]: I1202 20:33:00.777125 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf" (UID: "5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:33:00 crc kubenswrapper[4807]: I1202 20:33:00.847997 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l9k2\" (UniqueName: \"kubernetes.io/projected/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-kube-api-access-5l9k2\") on node \"crc\" DevicePath \"\"" Dec 02 20:33:00 crc kubenswrapper[4807]: I1202 20:33:00.848051 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:33:00 crc kubenswrapper[4807]: I1202 20:33:00.848076 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:33:00 crc kubenswrapper[4807]: I1202 20:33:00.993444 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="276d433e-e37e-494a-9d72-12e9a107bafc" path="/var/lib/kubelet/pods/276d433e-e37e-494a-9d72-12e9a107bafc/volumes" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.207792 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm" event={"ID":"5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf","Type":"ContainerDied","Data":"0f4725ad695f8202737a7dee6e23afc3eb20db4adb5b5154ab88e368525a9432"} Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.208148 4807 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0f4725ad695f8202737a7dee6e23afc3eb20db4adb5b5154ab88e368525a9432" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.207875 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.348012 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj"] Dec 02 20:33:01 crc kubenswrapper[4807]: E1202 20:33:01.348564 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276d433e-e37e-494a-9d72-12e9a107bafc" containerName="registry-server" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.348582 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="276d433e-e37e-494a-9d72-12e9a107bafc" containerName="registry-server" Dec 02 20:33:01 crc kubenswrapper[4807]: E1202 20:33:01.348613 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276d433e-e37e-494a-9d72-12e9a107bafc" containerName="extract-utilities" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.348621 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="276d433e-e37e-494a-9d72-12e9a107bafc" containerName="extract-utilities" Dec 02 20:33:01 crc kubenswrapper[4807]: E1202 20:33:01.348642 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276d433e-e37e-494a-9d72-12e9a107bafc" containerName="extract-content" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.348650 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="276d433e-e37e-494a-9d72-12e9a107bafc" containerName="extract-content" Dec 02 20:33:01 crc kubenswrapper[4807]: E1202 20:33:01.348666 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.348674 4807 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.348880 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.348896 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="276d433e-e37e-494a-9d72-12e9a107bafc" containerName="registry-server" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.349657 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.351819 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.351989 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.352082 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.360104 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.377666 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj"] Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.461093 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwr5b\" (UniqueName: 
\"kubernetes.io/projected/f741a266-b127-46cc-8304-9aedd57f07b5-kube-api-access-nwr5b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.461562 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.461801 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.563907 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.564056 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.564233 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwr5b\" (UniqueName: \"kubernetes.io/projected/f741a266-b127-46cc-8304-9aedd57f07b5-kube-api-access-nwr5b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.572005 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.573265 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.586094 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwr5b\" (UniqueName: \"kubernetes.io/projected/f741a266-b127-46cc-8304-9aedd57f07b5-kube-api-access-nwr5b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:01 crc kubenswrapper[4807]: I1202 20:33:01.672869 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:02 crc kubenswrapper[4807]: I1202 20:33:02.278182 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj"] Dec 02 20:33:03 crc kubenswrapper[4807]: I1202 20:33:03.227004 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" event={"ID":"f741a266-b127-46cc-8304-9aedd57f07b5","Type":"ContainerStarted","Data":"7e32e42ad9bbbabf9410f4e85f498c3e467634d034a0c9deb033c052959a5b0a"} Dec 02 20:33:04 crc kubenswrapper[4807]: I1202 20:33:04.239873 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" event={"ID":"f741a266-b127-46cc-8304-9aedd57f07b5","Type":"ContainerStarted","Data":"15a981a47df1dec621e80030ef0ef7bb09e7820dd0153d844d4a1c262247afc4"} Dec 02 20:33:04 crc kubenswrapper[4807]: I1202 20:33:04.269545 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" podStartSLOduration=2.592420625 podStartE2EDuration="3.269518137s" podCreationTimestamp="2025-12-02 20:33:01 +0000 UTC" firstStartedPulling="2025-12-02 20:33:02.288957641 +0000 UTC m=+2117.589865146" lastFinishedPulling="2025-12-02 20:33:02.966055163 +0000 UTC m=+2118.266962658" observedRunningTime="2025-12-02 20:33:04.260839575 +0000 UTC m=+2119.561747080" watchObservedRunningTime="2025-12-02 20:33:04.269518137 +0000 UTC m=+2119.570425642" Dec 02 20:33:08 crc kubenswrapper[4807]: I1202 20:33:08.293039 4807 generic.go:334] "Generic (PLEG): container finished" podID="f741a266-b127-46cc-8304-9aedd57f07b5" containerID="15a981a47df1dec621e80030ef0ef7bb09e7820dd0153d844d4a1c262247afc4" exitCode=0 Dec 02 20:33:08 crc kubenswrapper[4807]: I1202 20:33:08.293355 4807 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" event={"ID":"f741a266-b127-46cc-8304-9aedd57f07b5","Type":"ContainerDied","Data":"15a981a47df1dec621e80030ef0ef7bb09e7820dd0153d844d4a1c262247afc4"} Dec 02 20:33:09 crc kubenswrapper[4807]: I1202 20:33:09.830645 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:09 crc kubenswrapper[4807]: I1202 20:33:09.963512 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-inventory\") pod \"f741a266-b127-46cc-8304-9aedd57f07b5\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " Dec 02 20:33:09 crc kubenswrapper[4807]: I1202 20:33:09.963591 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-ssh-key\") pod \"f741a266-b127-46cc-8304-9aedd57f07b5\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " Dec 02 20:33:09 crc kubenswrapper[4807]: I1202 20:33:09.964938 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwr5b\" (UniqueName: \"kubernetes.io/projected/f741a266-b127-46cc-8304-9aedd57f07b5-kube-api-access-nwr5b\") pod \"f741a266-b127-46cc-8304-9aedd57f07b5\" (UID: \"f741a266-b127-46cc-8304-9aedd57f07b5\") " Dec 02 20:33:09 crc kubenswrapper[4807]: I1202 20:33:09.973986 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f741a266-b127-46cc-8304-9aedd57f07b5-kube-api-access-nwr5b" (OuterVolumeSpecName: "kube-api-access-nwr5b") pod "f741a266-b127-46cc-8304-9aedd57f07b5" (UID: "f741a266-b127-46cc-8304-9aedd57f07b5"). InnerVolumeSpecName "kube-api-access-nwr5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.001944 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-inventory" (OuterVolumeSpecName: "inventory") pod "f741a266-b127-46cc-8304-9aedd57f07b5" (UID: "f741a266-b127-46cc-8304-9aedd57f07b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.018194 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f741a266-b127-46cc-8304-9aedd57f07b5" (UID: "f741a266-b127-46cc-8304-9aedd57f07b5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.068507 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.068619 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f741a266-b127-46cc-8304-9aedd57f07b5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.068680 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwr5b\" (UniqueName: \"kubernetes.io/projected/f741a266-b127-46cc-8304-9aedd57f07b5-kube-api-access-nwr5b\") on node \"crc\" DevicePath \"\"" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.341372 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" 
event={"ID":"f741a266-b127-46cc-8304-9aedd57f07b5","Type":"ContainerDied","Data":"7e32e42ad9bbbabf9410f4e85f498c3e467634d034a0c9deb033c052959a5b0a"} Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.341703 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e32e42ad9bbbabf9410f4e85f498c3e467634d034a0c9deb033c052959a5b0a" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.341449 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.402138 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt"] Dec 02 20:33:10 crc kubenswrapper[4807]: E1202 20:33:10.402664 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f741a266-b127-46cc-8304-9aedd57f07b5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.402686 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f741a266-b127-46cc-8304-9aedd57f07b5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.402965 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f741a266-b127-46cc-8304-9aedd57f07b5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.403759 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:10 crc kubenswrapper[4807]: E1202 20:33:10.405040 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf741a266_b127_46cc_8304_9aedd57f07b5.slice\": RecentStats: unable to find data in memory cache]" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.409170 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.409428 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.410121 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.410302 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.428399 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt"] Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.578345 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5lbnt\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.578691 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mjhv\" (UniqueName: 
\"kubernetes.io/projected/0c2e1673-a8ae-401a-b874-d425c01fad63-kube-api-access-9mjhv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5lbnt\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.578886 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5lbnt\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.681348 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5lbnt\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.681489 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5lbnt\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.681552 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mjhv\" (UniqueName: \"kubernetes.io/projected/0c2e1673-a8ae-401a-b874-d425c01fad63-kube-api-access-9mjhv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5lbnt\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.687669 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5lbnt\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.687869 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5lbnt\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.709361 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mjhv\" (UniqueName: \"kubernetes.io/projected/0c2e1673-a8ae-401a-b874-d425c01fad63-kube-api-access-9mjhv\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-5lbnt\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.726203 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:10 crc kubenswrapper[4807]: I1202 20:33:10.944766 4807 scope.go:117] "RemoveContainer" containerID="5e5173c578ac80bca14a7eeb0505f508f98202bb92c7a97be190a8074c7ab083" Dec 02 20:33:11 crc kubenswrapper[4807]: I1202 20:33:11.009347 4807 scope.go:117] "RemoveContainer" containerID="1a7ebd77d06c82058f56bc01c466308e693f119872c024b5d0f18cab460cfa11" Dec 02 20:33:11 crc kubenswrapper[4807]: I1202 20:33:11.070537 4807 scope.go:117] "RemoveContainer" containerID="25e93ffe61c6b7ad31ce6232b73d2ae02283586d0d18b862cabbf5184f6d06ea" Dec 02 20:33:11 crc kubenswrapper[4807]: I1202 20:33:11.088771 4807 scope.go:117] "RemoveContainer" containerID="d7fb0c43f3c59949a33ece3d7f74b9a550ef712b92367d6c8a54c44d706b554a" Dec 02 20:33:11 crc kubenswrapper[4807]: I1202 20:33:11.135359 4807 scope.go:117] "RemoveContainer" containerID="62ee5d0af02152db741266410f81a2f85b6e7440da183fb4473f51ef1e9b4011" Dec 02 20:33:11 crc kubenswrapper[4807]: I1202 20:33:11.165774 4807 scope.go:117] "RemoveContainer" containerID="313f2233436bcc17c17ac5ac71d7bd4a4ca98714cfeecb3c919ac2d292748484" Dec 02 20:33:11 crc kubenswrapper[4807]: I1202 20:33:11.200620 4807 scope.go:117] "RemoveContainer" containerID="0452659f40781e97b757d36f04cd0ff6da79adcabe9b19966bb7e19a067f516e" Dec 02 20:33:11 crc kubenswrapper[4807]: I1202 20:33:11.288219 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt"] Dec 02 20:33:11 crc kubenswrapper[4807]: I1202 20:33:11.356662 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" event={"ID":"0c2e1673-a8ae-401a-b874-d425c01fad63","Type":"ContainerStarted","Data":"359bf22fe35c70a0358fc51b5f0503be67e058eef43f735d030642b2635371ff"} Dec 02 20:33:12 crc kubenswrapper[4807]: I1202 20:33:12.373858 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" event={"ID":"0c2e1673-a8ae-401a-b874-d425c01fad63","Type":"ContainerStarted","Data":"0559a9a669598a2f048bd1634c3518e36163c0cefb1dc4fabb42c03d9817fcbb"} Dec 02 20:33:12 crc kubenswrapper[4807]: I1202 20:33:12.405554 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" podStartSLOduration=1.935616606 podStartE2EDuration="2.405534466s" podCreationTimestamp="2025-12-02 20:33:10 +0000 UTC" firstStartedPulling="2025-12-02 20:33:11.317480755 +0000 UTC m=+2126.618388250" lastFinishedPulling="2025-12-02 20:33:11.787398615 +0000 UTC m=+2127.088306110" observedRunningTime="2025-12-02 20:33:12.393879678 +0000 UTC m=+2127.694787193" watchObservedRunningTime="2025-12-02 20:33:12.405534466 +0000 UTC m=+2127.706441961" Dec 02 20:33:25 crc kubenswrapper[4807]: I1202 20:33:25.066823 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9z4k8"] Dec 02 20:33:25 crc kubenswrapper[4807]: I1202 20:33:25.074999 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9z4k8"] Dec 02 20:33:26 crc kubenswrapper[4807]: I1202 20:33:26.990972 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c173d1-798c-4f98-bfe3-0251b7a19403" path="/var/lib/kubelet/pods/18c173d1-798c-4f98-bfe3-0251b7a19403/volumes" Dec 02 20:33:27 crc kubenswrapper[4807]: I1202 20:33:27.048309 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-j86hv"] Dec 02 20:33:27 crc kubenswrapper[4807]: I1202 20:33:27.057154 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-j86hv"] Dec 02 20:33:28 crc kubenswrapper[4807]: I1202 20:33:28.988874 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42be5b49-eca1-4532-8742-3dfbb8a4f910" 
path="/var/lib/kubelet/pods/42be5b49-eca1-4532-8742-3dfbb8a4f910/volumes" Dec 02 20:33:49 crc kubenswrapper[4807]: I1202 20:33:49.964998 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h8rpw"] Dec 02 20:33:49 crc kubenswrapper[4807]: I1202 20:33:49.967739 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:33:49 crc kubenswrapper[4807]: I1202 20:33:49.975355 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h8rpw"] Dec 02 20:33:50 crc kubenswrapper[4807]: I1202 20:33:50.056034 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ts7j\" (UniqueName: \"kubernetes.io/projected/5769cb06-4d53-48e9-91e0-39c5fb19cacc-kube-api-access-7ts7j\") pod \"community-operators-h8rpw\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:33:50 crc kubenswrapper[4807]: I1202 20:33:50.057270 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-catalog-content\") pod \"community-operators-h8rpw\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:33:50 crc kubenswrapper[4807]: I1202 20:33:50.057345 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-utilities\") pod \"community-operators-h8rpw\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:33:50 crc kubenswrapper[4807]: I1202 20:33:50.159481 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-catalog-content\") pod \"community-operators-h8rpw\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:33:50 crc kubenswrapper[4807]: I1202 20:33:50.159546 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-utilities\") pod \"community-operators-h8rpw\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:33:50 crc kubenswrapper[4807]: I1202 20:33:50.159658 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ts7j\" (UniqueName: \"kubernetes.io/projected/5769cb06-4d53-48e9-91e0-39c5fb19cacc-kube-api-access-7ts7j\") pod \"community-operators-h8rpw\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:33:50 crc kubenswrapper[4807]: I1202 20:33:50.160412 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-catalog-content\") pod \"community-operators-h8rpw\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:33:50 crc kubenswrapper[4807]: I1202 20:33:50.160681 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-utilities\") pod \"community-operators-h8rpw\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:33:50 crc kubenswrapper[4807]: I1202 20:33:50.177326 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ts7j\" (UniqueName: 
\"kubernetes.io/projected/5769cb06-4d53-48e9-91e0-39c5fb19cacc-kube-api-access-7ts7j\") pod \"community-operators-h8rpw\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:33:50 crc kubenswrapper[4807]: I1202 20:33:50.324941 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:33:50 crc kubenswrapper[4807]: I1202 20:33:50.841978 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h8rpw"] Dec 02 20:33:51 crc kubenswrapper[4807]: I1202 20:33:51.858469 4807 generic.go:334] "Generic (PLEG): container finished" podID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" containerID="3ad0904772f539061bf05b2410feccd3e6f2c71ead3aa13553745a16b472bbbc" exitCode=0 Dec 02 20:33:51 crc kubenswrapper[4807]: I1202 20:33:51.858531 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h8rpw" event={"ID":"5769cb06-4d53-48e9-91e0-39c5fb19cacc","Type":"ContainerDied","Data":"3ad0904772f539061bf05b2410feccd3e6f2c71ead3aa13553745a16b472bbbc"} Dec 02 20:33:51 crc kubenswrapper[4807]: I1202 20:33:51.858807 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h8rpw" event={"ID":"5769cb06-4d53-48e9-91e0-39c5fb19cacc","Type":"ContainerStarted","Data":"ad1be9d0f5014fca96ac7ed3cb8c8cc3a11c27afc60c0032443610197be49452"} Dec 02 20:33:52 crc kubenswrapper[4807]: I1202 20:33:52.876765 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h8rpw" event={"ID":"5769cb06-4d53-48e9-91e0-39c5fb19cacc","Type":"ContainerStarted","Data":"45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb"} Dec 02 20:33:53 crc kubenswrapper[4807]: I1202 20:33:53.894136 4807 generic.go:334] "Generic (PLEG): container finished" podID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" 
containerID="45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb" exitCode=0 Dec 02 20:33:53 crc kubenswrapper[4807]: I1202 20:33:53.894196 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h8rpw" event={"ID":"5769cb06-4d53-48e9-91e0-39c5fb19cacc","Type":"ContainerDied","Data":"45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb"} Dec 02 20:33:54 crc kubenswrapper[4807]: I1202 20:33:54.904368 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h8rpw" event={"ID":"5769cb06-4d53-48e9-91e0-39c5fb19cacc","Type":"ContainerStarted","Data":"64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43"} Dec 02 20:33:54 crc kubenswrapper[4807]: I1202 20:33:54.921004 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h8rpw" podStartSLOduration=3.411944478 podStartE2EDuration="5.920983402s" podCreationTimestamp="2025-12-02 20:33:49 +0000 UTC" firstStartedPulling="2025-12-02 20:33:51.86108119 +0000 UTC m=+2167.161988695" lastFinishedPulling="2025-12-02 20:33:54.370120124 +0000 UTC m=+2169.671027619" observedRunningTime="2025-12-02 20:33:54.91810448 +0000 UTC m=+2170.219011985" watchObservedRunningTime="2025-12-02 20:33:54.920983402 +0000 UTC m=+2170.221890897" Dec 02 20:33:55 crc kubenswrapper[4807]: I1202 20:33:55.915034 4807 generic.go:334] "Generic (PLEG): container finished" podID="0c2e1673-a8ae-401a-b874-d425c01fad63" containerID="0559a9a669598a2f048bd1634c3518e36163c0cefb1dc4fabb42c03d9817fcbb" exitCode=0 Dec 02 20:33:55 crc kubenswrapper[4807]: I1202 20:33:55.915302 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" event={"ID":"0c2e1673-a8ae-401a-b874-d425c01fad63","Type":"ContainerDied","Data":"0559a9a669598a2f048bd1634c3518e36163c0cefb1dc4fabb42c03d9817fcbb"} Dec 02 20:33:57 crc 
kubenswrapper[4807]: I1202 20:33:57.360214 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.523520 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mjhv\" (UniqueName: \"kubernetes.io/projected/0c2e1673-a8ae-401a-b874-d425c01fad63-kube-api-access-9mjhv\") pod \"0c2e1673-a8ae-401a-b874-d425c01fad63\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.523823 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-inventory\") pod \"0c2e1673-a8ae-401a-b874-d425c01fad63\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.524590 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-ssh-key\") pod \"0c2e1673-a8ae-401a-b874-d425c01fad63\" (UID: \"0c2e1673-a8ae-401a-b874-d425c01fad63\") " Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.529056 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2e1673-a8ae-401a-b874-d425c01fad63-kube-api-access-9mjhv" (OuterVolumeSpecName: "kube-api-access-9mjhv") pod "0c2e1673-a8ae-401a-b874-d425c01fad63" (UID: "0c2e1673-a8ae-401a-b874-d425c01fad63"). InnerVolumeSpecName "kube-api-access-9mjhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.556734 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0c2e1673-a8ae-401a-b874-d425c01fad63" (UID: "0c2e1673-a8ae-401a-b874-d425c01fad63"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.561908 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-inventory" (OuterVolumeSpecName: "inventory") pod "0c2e1673-a8ae-401a-b874-d425c01fad63" (UID: "0c2e1673-a8ae-401a-b874-d425c01fad63"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.627061 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mjhv\" (UniqueName: \"kubernetes.io/projected/0c2e1673-a8ae-401a-b874-d425c01fad63-kube-api-access-9mjhv\") on node \"crc\" DevicePath \"\"" Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.627115 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.627133 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c2e1673-a8ae-401a-b874-d425c01fad63-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.942534 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" 
event={"ID":"0c2e1673-a8ae-401a-b874-d425c01fad63","Type":"ContainerDied","Data":"359bf22fe35c70a0358fc51b5f0503be67e058eef43f735d030642b2635371ff"} Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.942598 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="359bf22fe35c70a0358fc51b5f0503be67e058eef43f735d030642b2635371ff" Dec 02 20:33:57 crc kubenswrapper[4807]: I1202 20:33:57.942636 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-5lbnt" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.055679 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249"] Dec 02 20:33:58 crc kubenswrapper[4807]: E1202 20:33:58.057109 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2e1673-a8ae-401a-b874-d425c01fad63" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.057239 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2e1673-a8ae-401a-b874-d425c01fad63" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.057653 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2e1673-a8ae-401a-b874-d425c01fad63" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.058989 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.061895 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.062090 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.062232 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.062363 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.065920 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249"] Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.139694 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dw249\" (UID: \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.139820 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lnrt\" (UniqueName: \"kubernetes.io/projected/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-kube-api-access-2lnrt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dw249\" (UID: \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.139868 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dw249\" (UID: \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.241018 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dw249\" (UID: \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.241105 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lnrt\" (UniqueName: \"kubernetes.io/projected/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-kube-api-access-2lnrt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dw249\" (UID: \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.241152 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dw249\" (UID: \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.246546 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dw249\" (UID: 
\"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.253485 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dw249\" (UID: \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.261674 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lnrt\" (UniqueName: \"kubernetes.io/projected/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-kube-api-access-2lnrt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dw249\" (UID: \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:33:58 crc kubenswrapper[4807]: I1202 20:33:58.385660 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:33:59 crc kubenswrapper[4807]: W1202 20:33:59.032919 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd72aa681_f5b3_4192_aa10_a4b6fc8519b9.slice/crio-f05b23953e93de6d27b4c7ad04486600e0be2f49b83fb32568e4fc1d8a61fc3c WatchSource:0}: Error finding container f05b23953e93de6d27b4c7ad04486600e0be2f49b83fb32568e4fc1d8a61fc3c: Status 404 returned error can't find the container with id f05b23953e93de6d27b4c7ad04486600e0be2f49b83fb32568e4fc1d8a61fc3c Dec 02 20:33:59 crc kubenswrapper[4807]: I1202 20:33:59.052188 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249"] Dec 02 20:33:59 crc kubenswrapper[4807]: I1202 20:33:59.974153 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" event={"ID":"d72aa681-f5b3-4192-aa10-a4b6fc8519b9","Type":"ContainerStarted","Data":"f05b23953e93de6d27b4c7ad04486600e0be2f49b83fb32568e4fc1d8a61fc3c"} Dec 02 20:34:00 crc kubenswrapper[4807]: I1202 20:34:00.325865 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:34:00 crc kubenswrapper[4807]: I1202 20:34:00.326188 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:34:00 crc kubenswrapper[4807]: I1202 20:34:00.386440 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:34:00 crc kubenswrapper[4807]: I1202 20:34:00.992915 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" 
event={"ID":"d72aa681-f5b3-4192-aa10-a4b6fc8519b9","Type":"ContainerStarted","Data":"2af9af8f2c59d1ced12f22070d5da3d74581742b7b0d00e19fc95647d909a7d5"} Dec 02 20:34:01 crc kubenswrapper[4807]: I1202 20:34:01.027445 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" podStartSLOduration=2.104258595 podStartE2EDuration="3.027424153s" podCreationTimestamp="2025-12-02 20:33:58 +0000 UTC" firstStartedPulling="2025-12-02 20:33:59.042657667 +0000 UTC m=+2174.343565162" lastFinishedPulling="2025-12-02 20:33:59.965823225 +0000 UTC m=+2175.266730720" observedRunningTime="2025-12-02 20:34:01.009791038 +0000 UTC m=+2176.310698543" watchObservedRunningTime="2025-12-02 20:34:01.027424153 +0000 UTC m=+2176.328331648" Dec 02 20:34:01 crc kubenswrapper[4807]: I1202 20:34:01.065181 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:34:01 crc kubenswrapper[4807]: I1202 20:34:01.138344 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h8rpw"] Dec 02 20:34:03 crc kubenswrapper[4807]: I1202 20:34:03.011770 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h8rpw" podUID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" containerName="registry-server" containerID="cri-o://64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43" gracePeriod=2 Dec 02 20:34:03 crc kubenswrapper[4807]: I1202 20:34:03.613439 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:34:03 crc kubenswrapper[4807]: I1202 20:34:03.692237 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-utilities\") pod \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " Dec 02 20:34:03 crc kubenswrapper[4807]: I1202 20:34:03.692327 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ts7j\" (UniqueName: \"kubernetes.io/projected/5769cb06-4d53-48e9-91e0-39c5fb19cacc-kube-api-access-7ts7j\") pod \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " Dec 02 20:34:03 crc kubenswrapper[4807]: I1202 20:34:03.692603 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-catalog-content\") pod \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\" (UID: \"5769cb06-4d53-48e9-91e0-39c5fb19cacc\") " Dec 02 20:34:03 crc kubenswrapper[4807]: I1202 20:34:03.693606 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-utilities" (OuterVolumeSpecName: "utilities") pod "5769cb06-4d53-48e9-91e0-39c5fb19cacc" (UID: "5769cb06-4d53-48e9-91e0-39c5fb19cacc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:34:03 crc kubenswrapper[4807]: I1202 20:34:03.707954 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5769cb06-4d53-48e9-91e0-39c5fb19cacc-kube-api-access-7ts7j" (OuterVolumeSpecName: "kube-api-access-7ts7j") pod "5769cb06-4d53-48e9-91e0-39c5fb19cacc" (UID: "5769cb06-4d53-48e9-91e0-39c5fb19cacc"). InnerVolumeSpecName "kube-api-access-7ts7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:34:03 crc kubenswrapper[4807]: I1202 20:34:03.744822 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5769cb06-4d53-48e9-91e0-39c5fb19cacc" (UID: "5769cb06-4d53-48e9-91e0-39c5fb19cacc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:34:03 crc kubenswrapper[4807]: I1202 20:34:03.795815 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:34:03 crc kubenswrapper[4807]: I1202 20:34:03.795862 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5769cb06-4d53-48e9-91e0-39c5fb19cacc-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:34:03 crc kubenswrapper[4807]: I1202 20:34:03.795883 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ts7j\" (UniqueName: \"kubernetes.io/projected/5769cb06-4d53-48e9-91e0-39c5fb19cacc-kube-api-access-7ts7j\") on node \"crc\" DevicePath \"\"" Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.027117 4807 generic.go:334] "Generic (PLEG): container finished" podID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" containerID="64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43" exitCode=0 Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.027167 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h8rpw" event={"ID":"5769cb06-4d53-48e9-91e0-39c5fb19cacc","Type":"ContainerDied","Data":"64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43"} Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.027199 4807 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-h8rpw" event={"ID":"5769cb06-4d53-48e9-91e0-39c5fb19cacc","Type":"ContainerDied","Data":"ad1be9d0f5014fca96ac7ed3cb8c8cc3a11c27afc60c0032443610197be49452"} Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.027221 4807 scope.go:117] "RemoveContainer" containerID="64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43" Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.027385 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h8rpw" Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.080027 4807 scope.go:117] "RemoveContainer" containerID="45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb" Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.086481 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h8rpw"] Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.097930 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h8rpw"] Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.119879 4807 scope.go:117] "RemoveContainer" containerID="3ad0904772f539061bf05b2410feccd3e6f2c71ead3aa13553745a16b472bbbc" Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.144895 4807 scope.go:117] "RemoveContainer" containerID="64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43" Dec 02 20:34:04 crc kubenswrapper[4807]: E1202 20:34:04.145392 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43\": container with ID starting with 64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43 not found: ID does not exist" containerID="64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43" Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 
20:34:04.145444 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43"} err="failed to get container status \"64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43\": rpc error: code = NotFound desc = could not find container \"64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43\": container with ID starting with 64aee7e2a6e4091a64ffaf142807277c7847a341e3e4a114db0fd70864f63f43 not found: ID does not exist" Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.145478 4807 scope.go:117] "RemoveContainer" containerID="45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb" Dec 02 20:34:04 crc kubenswrapper[4807]: E1202 20:34:04.145860 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb\": container with ID starting with 45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb not found: ID does not exist" containerID="45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb" Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.145916 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb"} err="failed to get container status \"45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb\": rpc error: code = NotFound desc = could not find container \"45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb\": container with ID starting with 45e2e120efdd5a97b492d7502a06be299eb5c9ca9c5885828602f629c00cf0eb not found: ID does not exist" Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.145955 4807 scope.go:117] "RemoveContainer" containerID="3ad0904772f539061bf05b2410feccd3e6f2c71ead3aa13553745a16b472bbbc" Dec 02 20:34:04 crc 
kubenswrapper[4807]: E1202 20:34:04.146295 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad0904772f539061bf05b2410feccd3e6f2c71ead3aa13553745a16b472bbbc\": container with ID starting with 3ad0904772f539061bf05b2410feccd3e6f2c71ead3aa13553745a16b472bbbc not found: ID does not exist" containerID="3ad0904772f539061bf05b2410feccd3e6f2c71ead3aa13553745a16b472bbbc" Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.146326 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad0904772f539061bf05b2410feccd3e6f2c71ead3aa13553745a16b472bbbc"} err="failed to get container status \"3ad0904772f539061bf05b2410feccd3e6f2c71ead3aa13553745a16b472bbbc\": rpc error: code = NotFound desc = could not find container \"3ad0904772f539061bf05b2410feccd3e6f2c71ead3aa13553745a16b472bbbc\": container with ID starting with 3ad0904772f539061bf05b2410feccd3e6f2c71ead3aa13553745a16b472bbbc not found: ID does not exist" Dec 02 20:34:04 crc kubenswrapper[4807]: I1202 20:34:04.994318 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" path="/var/lib/kubelet/pods/5769cb06-4d53-48e9-91e0-39c5fb19cacc/volumes" Dec 02 20:34:09 crc kubenswrapper[4807]: I1202 20:34:09.065777 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-fzzjd"] Dec 02 20:34:09 crc kubenswrapper[4807]: I1202 20:34:09.074694 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-fzzjd"] Dec 02 20:34:10 crc kubenswrapper[4807]: I1202 20:34:10.991445 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd0c1a2e-9558-4538-8e36-aa4def438cc1" path="/var/lib/kubelet/pods/dd0c1a2e-9558-4538-8e36-aa4def438cc1/volumes" Dec 02 20:34:11 crc kubenswrapper[4807]: I1202 20:34:11.379855 4807 scope.go:117] "RemoveContainer" 
containerID="33ecbbd48426c71757455798136c098c550adb765a2a431477989d38e70ad868" Dec 02 20:34:11 crc kubenswrapper[4807]: I1202 20:34:11.445548 4807 scope.go:117] "RemoveContainer" containerID="39bd67727ba39f200c809ece4e6154872c2dacdc584517234a8e1e51448f65a6" Dec 02 20:34:11 crc kubenswrapper[4807]: I1202 20:34:11.506685 4807 scope.go:117] "RemoveContainer" containerID="d8df39f6a448db0c81318fae0d805a9afcfff9aaedf924c7d66119bc0a63d1a9" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.347630 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q7nnv"] Dec 02 20:34:44 crc kubenswrapper[4807]: E1202 20:34:44.352524 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" containerName="extract-utilities" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.352704 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" containerName="extract-utilities" Dec 02 20:34:44 crc kubenswrapper[4807]: E1202 20:34:44.352948 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" containerName="registry-server" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.353118 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" containerName="registry-server" Dec 02 20:34:44 crc kubenswrapper[4807]: E1202 20:34:44.353272 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" containerName="extract-content" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.353410 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" containerName="extract-content" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.354074 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5769cb06-4d53-48e9-91e0-39c5fb19cacc" 
containerName="registry-server" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.357730 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.375760 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7nnv"] Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.451389 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-utilities\") pod \"redhat-marketplace-q7nnv\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.451567 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-catalog-content\") pod \"redhat-marketplace-q7nnv\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.451597 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lndsw\" (UniqueName: \"kubernetes.io/projected/c21d1b51-7b35-4679-994f-c788827531da-kube-api-access-lndsw\") pod \"redhat-marketplace-q7nnv\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.553364 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-catalog-content\") pod \"redhat-marketplace-q7nnv\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " 
pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.553673 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lndsw\" (UniqueName: \"kubernetes.io/projected/c21d1b51-7b35-4679-994f-c788827531da-kube-api-access-lndsw\") pod \"redhat-marketplace-q7nnv\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.553845 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-utilities\") pod \"redhat-marketplace-q7nnv\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.554058 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-catalog-content\") pod \"redhat-marketplace-q7nnv\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.554527 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-utilities\") pod \"redhat-marketplace-q7nnv\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.587446 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lndsw\" (UniqueName: \"kubernetes.io/projected/c21d1b51-7b35-4679-994f-c788827531da-kube-api-access-lndsw\") pod \"redhat-marketplace-q7nnv\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " 
pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:44 crc kubenswrapper[4807]: I1202 20:34:44.686607 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:45 crc kubenswrapper[4807]: I1202 20:34:45.171413 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7nnv"] Dec 02 20:34:45 crc kubenswrapper[4807]: W1202 20:34:45.180374 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc21d1b51_7b35_4679_994f_c788827531da.slice/crio-cf2f2d69b2e7fd53b38dd656fa2bd3caebd5f1d271eda89357a23f61f05a263d WatchSource:0}: Error finding container cf2f2d69b2e7fd53b38dd656fa2bd3caebd5f1d271eda89357a23f61f05a263d: Status 404 returned error can't find the container with id cf2f2d69b2e7fd53b38dd656fa2bd3caebd5f1d271eda89357a23f61f05a263d Dec 02 20:34:45 crc kubenswrapper[4807]: I1202 20:34:45.891640 4807 generic.go:334] "Generic (PLEG): container finished" podID="c21d1b51-7b35-4679-994f-c788827531da" containerID="b91d0f2987047769717f77cfc1975d7d0e0e5c6b381694a09d5e85fc62997287" exitCode=0 Dec 02 20:34:45 crc kubenswrapper[4807]: I1202 20:34:45.891932 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7nnv" event={"ID":"c21d1b51-7b35-4679-994f-c788827531da","Type":"ContainerDied","Data":"b91d0f2987047769717f77cfc1975d7d0e0e5c6b381694a09d5e85fc62997287"} Dec 02 20:34:45 crc kubenswrapper[4807]: I1202 20:34:45.892001 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7nnv" event={"ID":"c21d1b51-7b35-4679-994f-c788827531da","Type":"ContainerStarted","Data":"cf2f2d69b2e7fd53b38dd656fa2bd3caebd5f1d271eda89357a23f61f05a263d"} Dec 02 20:34:46 crc kubenswrapper[4807]: I1202 20:34:46.908559 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-q7nnv" event={"ID":"c21d1b51-7b35-4679-994f-c788827531da","Type":"ContainerStarted","Data":"b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5"} Dec 02 20:34:47 crc kubenswrapper[4807]: I1202 20:34:47.939204 4807 generic.go:334] "Generic (PLEG): container finished" podID="c21d1b51-7b35-4679-994f-c788827531da" containerID="b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5" exitCode=0 Dec 02 20:34:47 crc kubenswrapper[4807]: I1202 20:34:47.939277 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7nnv" event={"ID":"c21d1b51-7b35-4679-994f-c788827531da","Type":"ContainerDied","Data":"b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5"} Dec 02 20:34:48 crc kubenswrapper[4807]: I1202 20:34:48.953228 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7nnv" event={"ID":"c21d1b51-7b35-4679-994f-c788827531da","Type":"ContainerStarted","Data":"349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2"} Dec 02 20:34:48 crc kubenswrapper[4807]: I1202 20:34:48.984665 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q7nnv" podStartSLOduration=2.2913415759999998 podStartE2EDuration="4.98462357s" podCreationTimestamp="2025-12-02 20:34:44 +0000 UTC" firstStartedPulling="2025-12-02 20:34:45.89441941 +0000 UTC m=+2221.195326905" lastFinishedPulling="2025-12-02 20:34:48.587701404 +0000 UTC m=+2223.888608899" observedRunningTime="2025-12-02 20:34:48.978942938 +0000 UTC m=+2224.279850443" watchObservedRunningTime="2025-12-02 20:34:48.98462357 +0000 UTC m=+2224.285531065" Dec 02 20:34:54 crc kubenswrapper[4807]: I1202 20:34:54.686747 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:54 crc kubenswrapper[4807]: I1202 20:34:54.687210 
4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:54 crc kubenswrapper[4807]: I1202 20:34:54.752684 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:55 crc kubenswrapper[4807]: I1202 20:34:55.092449 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:55 crc kubenswrapper[4807]: I1202 20:34:55.149555 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7nnv"] Dec 02 20:34:57 crc kubenswrapper[4807]: I1202 20:34:57.068454 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q7nnv" podUID="c21d1b51-7b35-4679-994f-c788827531da" containerName="registry-server" containerID="cri-o://349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2" gracePeriod=2 Dec 02 20:34:57 crc kubenswrapper[4807]: I1202 20:34:57.620850 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:57 crc kubenswrapper[4807]: I1202 20:34:57.672522 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-utilities\") pod \"c21d1b51-7b35-4679-994f-c788827531da\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " Dec 02 20:34:57 crc kubenswrapper[4807]: I1202 20:34:57.672872 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lndsw\" (UniqueName: \"kubernetes.io/projected/c21d1b51-7b35-4679-994f-c788827531da-kube-api-access-lndsw\") pod \"c21d1b51-7b35-4679-994f-c788827531da\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " Dec 02 20:34:57 crc kubenswrapper[4807]: I1202 20:34:57.672945 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-catalog-content\") pod \"c21d1b51-7b35-4679-994f-c788827531da\" (UID: \"c21d1b51-7b35-4679-994f-c788827531da\") " Dec 02 20:34:57 crc kubenswrapper[4807]: I1202 20:34:57.674074 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-utilities" (OuterVolumeSpecName: "utilities") pod "c21d1b51-7b35-4679-994f-c788827531da" (UID: "c21d1b51-7b35-4679-994f-c788827531da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:34:57 crc kubenswrapper[4807]: I1202 20:34:57.682075 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21d1b51-7b35-4679-994f-c788827531da-kube-api-access-lndsw" (OuterVolumeSpecName: "kube-api-access-lndsw") pod "c21d1b51-7b35-4679-994f-c788827531da" (UID: "c21d1b51-7b35-4679-994f-c788827531da"). InnerVolumeSpecName "kube-api-access-lndsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:34:57 crc kubenswrapper[4807]: I1202 20:34:57.710143 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c21d1b51-7b35-4679-994f-c788827531da" (UID: "c21d1b51-7b35-4679-994f-c788827531da"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:34:57 crc kubenswrapper[4807]: I1202 20:34:57.776043 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lndsw\" (UniqueName: \"kubernetes.io/projected/c21d1b51-7b35-4679-994f-c788827531da-kube-api-access-lndsw\") on node \"crc\" DevicePath \"\"" Dec 02 20:34:57 crc kubenswrapper[4807]: I1202 20:34:57.776423 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:34:57 crc kubenswrapper[4807]: I1202 20:34:57.776442 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21d1b51-7b35-4679-994f-c788827531da-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.082222 4807 generic.go:334] "Generic (PLEG): container finished" podID="c21d1b51-7b35-4679-994f-c788827531da" containerID="349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2" exitCode=0 Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.082290 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q7nnv" event={"ID":"c21d1b51-7b35-4679-994f-c788827531da","Type":"ContainerDied","Data":"349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2"} Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.082329 4807 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-q7nnv" event={"ID":"c21d1b51-7b35-4679-994f-c788827531da","Type":"ContainerDied","Data":"cf2f2d69b2e7fd53b38dd656fa2bd3caebd5f1d271eda89357a23f61f05a263d"} Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.082349 4807 scope.go:117] "RemoveContainer" containerID="349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.082497 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q7nnv" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.121072 4807 scope.go:117] "RemoveContainer" containerID="b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.121627 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7nnv"] Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.131658 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q7nnv"] Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.147679 4807 scope.go:117] "RemoveContainer" containerID="b91d0f2987047769717f77cfc1975d7d0e0e5c6b381694a09d5e85fc62997287" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.197910 4807 scope.go:117] "RemoveContainer" containerID="349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2" Dec 02 20:34:58 crc kubenswrapper[4807]: E1202 20:34:58.198531 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2\": container with ID starting with 349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2 not found: ID does not exist" containerID="349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.198582 4807 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2"} err="failed to get container status \"349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2\": rpc error: code = NotFound desc = could not find container \"349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2\": container with ID starting with 349940ea08cf7083774d23c5823db767c23dcc1886d1bdba651beec4ab6670c2 not found: ID does not exist" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.198613 4807 scope.go:117] "RemoveContainer" containerID="b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5" Dec 02 20:34:58 crc kubenswrapper[4807]: E1202 20:34:58.198900 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5\": container with ID starting with b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5 not found: ID does not exist" containerID="b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.198940 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5"} err="failed to get container status \"b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5\": rpc error: code = NotFound desc = could not find container \"b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5\": container with ID starting with b789283f8dc1c12fe13c23cab1f32565acc6f413b4932a6ef16c993a44484ea5 not found: ID does not exist" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.198999 4807 scope.go:117] "RemoveContainer" containerID="b91d0f2987047769717f77cfc1975d7d0e0e5c6b381694a09d5e85fc62997287" Dec 02 20:34:58 crc kubenswrapper[4807]: E1202 
20:34:58.199249 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b91d0f2987047769717f77cfc1975d7d0e0e5c6b381694a09d5e85fc62997287\": container with ID starting with b91d0f2987047769717f77cfc1975d7d0e0e5c6b381694a09d5e85fc62997287 not found: ID does not exist" containerID="b91d0f2987047769717f77cfc1975d7d0e0e5c6b381694a09d5e85fc62997287" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.199269 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b91d0f2987047769717f77cfc1975d7d0e0e5c6b381694a09d5e85fc62997287"} err="failed to get container status \"b91d0f2987047769717f77cfc1975d7d0e0e5c6b381694a09d5e85fc62997287\": rpc error: code = NotFound desc = could not find container \"b91d0f2987047769717f77cfc1975d7d0e0e5c6b381694a09d5e85fc62997287\": container with ID starting with b91d0f2987047769717f77cfc1975d7d0e0e5c6b381694a09d5e85fc62997287 not found: ID does not exist" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.293267 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.293363 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:34:58 crc kubenswrapper[4807]: I1202 20:34:58.997610 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21d1b51-7b35-4679-994f-c788827531da" 
path="/var/lib/kubelet/pods/c21d1b51-7b35-4679-994f-c788827531da/volumes" Dec 02 20:35:04 crc kubenswrapper[4807]: I1202 20:35:04.165368 4807 generic.go:334] "Generic (PLEG): container finished" podID="d72aa681-f5b3-4192-aa10-a4b6fc8519b9" containerID="2af9af8f2c59d1ced12f22070d5da3d74581742b7b0d00e19fc95647d909a7d5" exitCode=0 Dec 02 20:35:04 crc kubenswrapper[4807]: I1202 20:35:04.165473 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" event={"ID":"d72aa681-f5b3-4192-aa10-a4b6fc8519b9","Type":"ContainerDied","Data":"2af9af8f2c59d1ced12f22070d5da3d74581742b7b0d00e19fc95647d909a7d5"} Dec 02 20:35:05 crc kubenswrapper[4807]: I1202 20:35:05.762101 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:35:05 crc kubenswrapper[4807]: I1202 20:35:05.867550 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-ssh-key\") pod \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\" (UID: \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " Dec 02 20:35:05 crc kubenswrapper[4807]: I1202 20:35:05.867893 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-inventory\") pod \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\" (UID: \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " Dec 02 20:35:05 crc kubenswrapper[4807]: I1202 20:35:05.868073 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lnrt\" (UniqueName: \"kubernetes.io/projected/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-kube-api-access-2lnrt\") pod \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\" (UID: \"d72aa681-f5b3-4192-aa10-a4b6fc8519b9\") " Dec 02 20:35:05 crc kubenswrapper[4807]: I1202 20:35:05.874396 4807 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-kube-api-access-2lnrt" (OuterVolumeSpecName: "kube-api-access-2lnrt") pod "d72aa681-f5b3-4192-aa10-a4b6fc8519b9" (UID: "d72aa681-f5b3-4192-aa10-a4b6fc8519b9"). InnerVolumeSpecName "kube-api-access-2lnrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:35:05 crc kubenswrapper[4807]: I1202 20:35:05.906217 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d72aa681-f5b3-4192-aa10-a4b6fc8519b9" (UID: "d72aa681-f5b3-4192-aa10-a4b6fc8519b9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:35:05 crc kubenswrapper[4807]: I1202 20:35:05.910649 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-inventory" (OuterVolumeSpecName: "inventory") pod "d72aa681-f5b3-4192-aa10-a4b6fc8519b9" (UID: "d72aa681-f5b3-4192-aa10-a4b6fc8519b9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:35:05 crc kubenswrapper[4807]: I1202 20:35:05.971807 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:05 crc kubenswrapper[4807]: I1202 20:35:05.971861 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lnrt\" (UniqueName: \"kubernetes.io/projected/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-kube-api-access-2lnrt\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:05 crc kubenswrapper[4807]: I1202 20:35:05.971883 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d72aa681-f5b3-4192-aa10-a4b6fc8519b9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.192951 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" event={"ID":"d72aa681-f5b3-4192-aa10-a4b6fc8519b9","Type":"ContainerDied","Data":"f05b23953e93de6d27b4c7ad04486600e0be2f49b83fb32568e4fc1d8a61fc3c"} Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.193019 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f05b23953e93de6d27b4c7ad04486600e0be2f49b83fb32568e4fc1d8a61fc3c" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.193041 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dw249" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.336815 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8d2sc"] Dec 02 20:35:06 crc kubenswrapper[4807]: E1202 20:35:06.337532 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72aa681-f5b3-4192-aa10-a4b6fc8519b9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.337573 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72aa681-f5b3-4192-aa10-a4b6fc8519b9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:35:06 crc kubenswrapper[4807]: E1202 20:35:06.337613 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21d1b51-7b35-4679-994f-c788827531da" containerName="extract-content" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.337626 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21d1b51-7b35-4679-994f-c788827531da" containerName="extract-content" Dec 02 20:35:06 crc kubenswrapper[4807]: E1202 20:35:06.337656 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21d1b51-7b35-4679-994f-c788827531da" containerName="registry-server" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.337684 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21d1b51-7b35-4679-994f-c788827531da" containerName="registry-server" Dec 02 20:35:06 crc kubenswrapper[4807]: E1202 20:35:06.337713 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21d1b51-7b35-4679-994f-c788827531da" containerName="extract-utilities" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.337753 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21d1b51-7b35-4679-994f-c788827531da" containerName="extract-utilities" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.338123 4807 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d72aa681-f5b3-4192-aa10-a4b6fc8519b9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.338207 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21d1b51-7b35-4679-994f-c788827531da" containerName="registry-server" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.339351 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.344103 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.344917 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.344942 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.345096 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.363096 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8d2sc"] Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.381443 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf9tq\" (UniqueName: \"kubernetes.io/projected/d0bb0a20-106f-412e-8ba3-b218bacdadf5-kube-api-access-jf9tq\") pod \"ssh-known-hosts-edpm-deployment-8d2sc\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.381573 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8d2sc\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.381650 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8d2sc\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.483548 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8d2sc\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.483745 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf9tq\" (UniqueName: \"kubernetes.io/projected/d0bb0a20-106f-412e-8ba3-b218bacdadf5-kube-api-access-jf9tq\") pod \"ssh-known-hosts-edpm-deployment-8d2sc\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.483915 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8d2sc\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.490108 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-8d2sc\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.490596 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-8d2sc\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.510898 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf9tq\" (UniqueName: \"kubernetes.io/projected/d0bb0a20-106f-412e-8ba3-b218bacdadf5-kube-api-access-jf9tq\") pod \"ssh-known-hosts-edpm-deployment-8d2sc\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:06 crc kubenswrapper[4807]: I1202 20:35:06.656604 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:07 crc kubenswrapper[4807]: I1202 20:35:07.030012 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-8d2sc"] Dec 02 20:35:07 crc kubenswrapper[4807]: W1202 20:35:07.032617 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0bb0a20_106f_412e_8ba3_b218bacdadf5.slice/crio-103fbe758a790796beb64885e5bb41b2e39e33321de3c40438a3f61d8394a972 WatchSource:0}: Error finding container 103fbe758a790796beb64885e5bb41b2e39e33321de3c40438a3f61d8394a972: Status 404 returned error can't find the container with id 103fbe758a790796beb64885e5bb41b2e39e33321de3c40438a3f61d8394a972 Dec 02 20:35:07 crc kubenswrapper[4807]: I1202 20:35:07.209155 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" event={"ID":"d0bb0a20-106f-412e-8ba3-b218bacdadf5","Type":"ContainerStarted","Data":"103fbe758a790796beb64885e5bb41b2e39e33321de3c40438a3f61d8394a972"} Dec 02 20:35:08 crc kubenswrapper[4807]: I1202 20:35:08.223204 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" event={"ID":"d0bb0a20-106f-412e-8ba3-b218bacdadf5","Type":"ContainerStarted","Data":"feb47beb22f76d550e9d89c79087f50e3f42087996f9ac23460108597ebb2d03"} Dec 02 20:35:08 crc kubenswrapper[4807]: I1202 20:35:08.257298 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" podStartSLOduration=1.725903647 podStartE2EDuration="2.257267167s" podCreationTimestamp="2025-12-02 20:35:06 +0000 UTC" firstStartedPulling="2025-12-02 20:35:07.035661305 +0000 UTC m=+2242.336568810" lastFinishedPulling="2025-12-02 20:35:07.567024805 +0000 UTC m=+2242.867932330" observedRunningTime="2025-12-02 20:35:08.255041723 +0000 UTC m=+2243.555949228" 
watchObservedRunningTime="2025-12-02 20:35:08.257267167 +0000 UTC m=+2243.558174672" Dec 02 20:35:16 crc kubenswrapper[4807]: I1202 20:35:16.333595 4807 generic.go:334] "Generic (PLEG): container finished" podID="d0bb0a20-106f-412e-8ba3-b218bacdadf5" containerID="feb47beb22f76d550e9d89c79087f50e3f42087996f9ac23460108597ebb2d03" exitCode=0 Dec 02 20:35:16 crc kubenswrapper[4807]: I1202 20:35:16.333694 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" event={"ID":"d0bb0a20-106f-412e-8ba3-b218bacdadf5","Type":"ContainerDied","Data":"feb47beb22f76d550e9d89c79087f50e3f42087996f9ac23460108597ebb2d03"} Dec 02 20:35:17 crc kubenswrapper[4807]: I1202 20:35:17.885949 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.057531 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-inventory-0\") pod \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.058500 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf9tq\" (UniqueName: \"kubernetes.io/projected/d0bb0a20-106f-412e-8ba3-b218bacdadf5-kube-api-access-jf9tq\") pod \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.058993 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-ssh-key-openstack-edpm-ipam\") pod \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\" (UID: \"d0bb0a20-106f-412e-8ba3-b218bacdadf5\") " Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 
20:35:18.066214 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0bb0a20-106f-412e-8ba3-b218bacdadf5-kube-api-access-jf9tq" (OuterVolumeSpecName: "kube-api-access-jf9tq") pod "d0bb0a20-106f-412e-8ba3-b218bacdadf5" (UID: "d0bb0a20-106f-412e-8ba3-b218bacdadf5"). InnerVolumeSpecName "kube-api-access-jf9tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.094878 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d0bb0a20-106f-412e-8ba3-b218bacdadf5" (UID: "d0bb0a20-106f-412e-8ba3-b218bacdadf5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.120546 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d0bb0a20-106f-412e-8ba3-b218bacdadf5" (UID: "d0bb0a20-106f-412e-8ba3-b218bacdadf5"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.162689 4807 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.162750 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf9tq\" (UniqueName: \"kubernetes.io/projected/d0bb0a20-106f-412e-8ba3-b218bacdadf5-kube-api-access-jf9tq\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.162763 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d0bb0a20-106f-412e-8ba3-b218bacdadf5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.362889 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" event={"ID":"d0bb0a20-106f-412e-8ba3-b218bacdadf5","Type":"ContainerDied","Data":"103fbe758a790796beb64885e5bb41b2e39e33321de3c40438a3f61d8394a972"} Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.362954 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="103fbe758a790796beb64885e5bb41b2e39e33321de3c40438a3f61d8394a972" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.363502 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-8d2sc" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.489185 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd"] Dec 02 20:35:18 crc kubenswrapper[4807]: E1202 20:35:18.489675 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bb0a20-106f-412e-8ba3-b218bacdadf5" containerName="ssh-known-hosts-edpm-deployment" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.489694 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bb0a20-106f-412e-8ba3-b218bacdadf5" containerName="ssh-known-hosts-edpm-deployment" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.489983 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bb0a20-106f-412e-8ba3-b218bacdadf5" containerName="ssh-known-hosts-edpm-deployment" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.490900 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.494305 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.494469 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.503411 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd"] Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.519516 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.519837 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.570991 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pzgnd\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.571326 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdgcp\" (UniqueName: \"kubernetes.io/projected/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-kube-api-access-gdgcp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pzgnd\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.571461 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pzgnd\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.673426 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdgcp\" (UniqueName: \"kubernetes.io/projected/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-kube-api-access-gdgcp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pzgnd\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.673500 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pzgnd\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.673577 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pzgnd\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.677174 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pzgnd\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.678166 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pzgnd\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.692753 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdgcp\" (UniqueName: \"kubernetes.io/projected/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-kube-api-access-gdgcp\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-pzgnd\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:18 crc kubenswrapper[4807]: I1202 20:35:18.879566 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:19 crc kubenswrapper[4807]: I1202 20:35:19.476090 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd"] Dec 02 20:35:20 crc kubenswrapper[4807]: I1202 20:35:20.396628 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" event={"ID":"94998bd3-5f5b-47cd-b14c-39e55cb78eaa","Type":"ContainerStarted","Data":"e4103a6f2082748f68eab61fcb6364a67f0b44ac5cbb7c699ed830bf7e053a36"} Dec 02 20:35:20 crc kubenswrapper[4807]: I1202 20:35:20.396979 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" event={"ID":"94998bd3-5f5b-47cd-b14c-39e55cb78eaa","Type":"ContainerStarted","Data":"9c0a95a060a80ccb19d1c14c33b891cdccbe5bffb532dff0cf586655c0825e27"} Dec 02 20:35:20 crc kubenswrapper[4807]: I1202 20:35:20.426304 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" podStartSLOduration=1.921707532 podStartE2EDuration="2.426283204s" podCreationTimestamp="2025-12-02 20:35:18 +0000 UTC" firstStartedPulling="2025-12-02 20:35:19.48483029 +0000 UTC m=+2254.785737785" lastFinishedPulling="2025-12-02 20:35:19.989405932 +0000 UTC m=+2255.290313457" observedRunningTime="2025-12-02 20:35:20.420023164 +0000 UTC m=+2255.720930669" watchObservedRunningTime="2025-12-02 20:35:20.426283204 +0000 UTC m=+2255.727190709" Dec 02 20:35:27 crc kubenswrapper[4807]: I1202 20:35:27.722350 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mxhgh"] Dec 02 20:35:27 crc kubenswrapper[4807]: I1202 20:35:27.725574 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:27 crc kubenswrapper[4807]: I1202 20:35:27.736533 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mxhgh"] Dec 02 20:35:27 crc kubenswrapper[4807]: I1202 20:35:27.876160 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-utilities\") pod \"redhat-operators-mxhgh\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:27 crc kubenswrapper[4807]: I1202 20:35:27.876538 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-catalog-content\") pod \"redhat-operators-mxhgh\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:27 crc kubenswrapper[4807]: I1202 20:35:27.876605 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgn6n\" (UniqueName: \"kubernetes.io/projected/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-kube-api-access-sgn6n\") pod \"redhat-operators-mxhgh\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:27 crc kubenswrapper[4807]: I1202 20:35:27.979091 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-utilities\") pod \"redhat-operators-mxhgh\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:27 crc kubenswrapper[4807]: I1202 20:35:27.979524 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-catalog-content\") pod \"redhat-operators-mxhgh\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:27 crc kubenswrapper[4807]: I1202 20:35:27.979547 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-utilities\") pod \"redhat-operators-mxhgh\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:27 crc kubenswrapper[4807]: I1202 20:35:27.979734 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgn6n\" (UniqueName: \"kubernetes.io/projected/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-kube-api-access-sgn6n\") pod \"redhat-operators-mxhgh\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:27 crc kubenswrapper[4807]: I1202 20:35:27.979774 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-catalog-content\") pod \"redhat-operators-mxhgh\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:28 crc kubenswrapper[4807]: I1202 20:35:28.000748 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgn6n\" (UniqueName: \"kubernetes.io/projected/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-kube-api-access-sgn6n\") pod \"redhat-operators-mxhgh\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:28 crc kubenswrapper[4807]: I1202 20:35:28.054286 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:28 crc kubenswrapper[4807]: I1202 20:35:28.294447 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:35:28 crc kubenswrapper[4807]: I1202 20:35:28.294691 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:35:28 crc kubenswrapper[4807]: I1202 20:35:28.532190 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mxhgh"] Dec 02 20:35:29 crc kubenswrapper[4807]: I1202 20:35:29.506671 4807 generic.go:334] "Generic (PLEG): container finished" podID="94998bd3-5f5b-47cd-b14c-39e55cb78eaa" containerID="e4103a6f2082748f68eab61fcb6364a67f0b44ac5cbb7c699ed830bf7e053a36" exitCode=0 Dec 02 20:35:29 crc kubenswrapper[4807]: I1202 20:35:29.506776 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" event={"ID":"94998bd3-5f5b-47cd-b14c-39e55cb78eaa","Type":"ContainerDied","Data":"e4103a6f2082748f68eab61fcb6364a67f0b44ac5cbb7c699ed830bf7e053a36"} Dec 02 20:35:29 crc kubenswrapper[4807]: I1202 20:35:29.510687 4807 generic.go:334] "Generic (PLEG): container finished" podID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerID="c603b1e6a9214cad1111a196a0ad9d69c2d0d95555c6047692341fb89a8874e3" exitCode=0 Dec 02 20:35:29 crc kubenswrapper[4807]: I1202 20:35:29.510885 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-mxhgh" event={"ID":"218ef5ca-7e1a-4db0-acc8-da97cd9d162c","Type":"ContainerDied","Data":"c603b1e6a9214cad1111a196a0ad9d69c2d0d95555c6047692341fb89a8874e3"} Dec 02 20:35:29 crc kubenswrapper[4807]: I1202 20:35:29.511009 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxhgh" event={"ID":"218ef5ca-7e1a-4db0-acc8-da97cd9d162c","Type":"ContainerStarted","Data":"170008e423ac5e4de31208e061b0cc6fac914677cd833f297f360190ae889ad0"} Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.012618 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.092488 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-ssh-key\") pod \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.092543 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-inventory\") pod \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.092606 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdgcp\" (UniqueName: \"kubernetes.io/projected/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-kube-api-access-gdgcp\") pod \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\" (UID: \"94998bd3-5f5b-47cd-b14c-39e55cb78eaa\") " Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.119185 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-kube-api-access-gdgcp" (OuterVolumeSpecName: "kube-api-access-gdgcp") pod "94998bd3-5f5b-47cd-b14c-39e55cb78eaa" (UID: "94998bd3-5f5b-47cd-b14c-39e55cb78eaa"). InnerVolumeSpecName "kube-api-access-gdgcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.127173 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-inventory" (OuterVolumeSpecName: "inventory") pod "94998bd3-5f5b-47cd-b14c-39e55cb78eaa" (UID: "94998bd3-5f5b-47cd-b14c-39e55cb78eaa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.132452 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "94998bd3-5f5b-47cd-b14c-39e55cb78eaa" (UID: "94998bd3-5f5b-47cd-b14c-39e55cb78eaa"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.194345 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.194379 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.194390 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdgcp\" (UniqueName: \"kubernetes.io/projected/94998bd3-5f5b-47cd-b14c-39e55cb78eaa-kube-api-access-gdgcp\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.533510 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" event={"ID":"94998bd3-5f5b-47cd-b14c-39e55cb78eaa","Type":"ContainerDied","Data":"9c0a95a060a80ccb19d1c14c33b891cdccbe5bffb532dff0cf586655c0825e27"} Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.533543 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-pzgnd" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.533554 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c0a95a060a80ccb19d1c14c33b891cdccbe5bffb532dff0cf586655c0825e27" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.536328 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxhgh" event={"ID":"218ef5ca-7e1a-4db0-acc8-da97cd9d162c","Type":"ContainerStarted","Data":"daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3"} Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.620982 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6"] Dec 02 20:35:31 crc kubenswrapper[4807]: E1202 20:35:31.621488 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94998bd3-5f5b-47cd-b14c-39e55cb78eaa" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.621514 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="94998bd3-5f5b-47cd-b14c-39e55cb78eaa" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.621821 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="94998bd3-5f5b-47cd-b14c-39e55cb78eaa" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.622836 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.625008 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.625019 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.625352 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.625365 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.639568 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6"] Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.703635 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tlj8\" (UniqueName: \"kubernetes.io/projected/89a829f1-32e4-4b5b-ba48-196916b1da6f-kube-api-access-2tlj8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.703760 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.703927 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.808171 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.808429 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tlj8\" (UniqueName: \"kubernetes.io/projected/89a829f1-32e4-4b5b-ba48-196916b1da6f-kube-api-access-2tlj8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.808597 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.827787 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.831187 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.837144 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tlj8\" (UniqueName: \"kubernetes.io/projected/89a829f1-32e4-4b5b-ba48-196916b1da6f-kube-api-access-2tlj8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:31 crc kubenswrapper[4807]: I1202 20:35:31.944287 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:32 crc kubenswrapper[4807]: I1202 20:35:32.779138 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6"] Dec 02 20:35:33 crc kubenswrapper[4807]: I1202 20:35:33.567017 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" event={"ID":"89a829f1-32e4-4b5b-ba48-196916b1da6f","Type":"ContainerStarted","Data":"4f4d64a13e9dedd6c88369f3ffd53a819bde0ce0bc9a2224f99c7f5f2aee8beb"} Dec 02 20:35:34 crc kubenswrapper[4807]: I1202 20:35:34.584221 4807 generic.go:334] "Generic (PLEG): container finished" podID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerID="daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3" exitCode=0 Dec 02 20:35:34 crc kubenswrapper[4807]: I1202 20:35:34.584333 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxhgh" event={"ID":"218ef5ca-7e1a-4db0-acc8-da97cd9d162c","Type":"ContainerDied","Data":"daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3"} Dec 02 20:35:35 crc kubenswrapper[4807]: I1202 20:35:35.604837 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" event={"ID":"89a829f1-32e4-4b5b-ba48-196916b1da6f","Type":"ContainerStarted","Data":"74985d8de780639d77377cc50a2bc1ad379835a5bbc5d4214c1b5ce760672b1c"} Dec 02 20:35:35 crc kubenswrapper[4807]: I1202 20:35:35.608202 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxhgh" event={"ID":"218ef5ca-7e1a-4db0-acc8-da97cd9d162c","Type":"ContainerStarted","Data":"a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e"} Dec 02 20:35:35 crc kubenswrapper[4807]: I1202 20:35:35.629095 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" podStartSLOduration=3.084269656 podStartE2EDuration="4.629071102s" podCreationTimestamp="2025-12-02 20:35:31 +0000 UTC" firstStartedPulling="2025-12-02 20:35:32.7966522 +0000 UTC m=+2268.097559685" lastFinishedPulling="2025-12-02 20:35:34.341453636 +0000 UTC m=+2269.642361131" observedRunningTime="2025-12-02 20:35:35.616232314 +0000 UTC m=+2270.917139819" watchObservedRunningTime="2025-12-02 20:35:35.629071102 +0000 UTC m=+2270.929978607" Dec 02 20:35:35 crc kubenswrapper[4807]: I1202 20:35:35.648352 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mxhgh" podStartSLOduration=3.104377684 podStartE2EDuration="8.648331574s" podCreationTimestamp="2025-12-02 20:35:27 +0000 UTC" firstStartedPulling="2025-12-02 20:35:29.514813856 +0000 UTC m=+2264.815721381" lastFinishedPulling="2025-12-02 20:35:35.058767736 +0000 UTC m=+2270.359675271" observedRunningTime="2025-12-02 20:35:35.636428993 +0000 UTC m=+2270.937336508" watchObservedRunningTime="2025-12-02 20:35:35.648331574 +0000 UTC m=+2270.949239079" Dec 02 20:35:38 crc kubenswrapper[4807]: I1202 20:35:38.054638 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:38 crc kubenswrapper[4807]: I1202 20:35:38.055420 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:39 crc kubenswrapper[4807]: I1202 20:35:39.148880 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mxhgh" podUID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerName="registry-server" probeResult="failure" output=< Dec 02 20:35:39 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 20:35:39 crc kubenswrapper[4807]: > Dec 02 20:35:45 crc kubenswrapper[4807]: I1202 20:35:45.723686 4807 
generic.go:334] "Generic (PLEG): container finished" podID="89a829f1-32e4-4b5b-ba48-196916b1da6f" containerID="74985d8de780639d77377cc50a2bc1ad379835a5bbc5d4214c1b5ce760672b1c" exitCode=0 Dec 02 20:35:45 crc kubenswrapper[4807]: I1202 20:35:45.723791 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" event={"ID":"89a829f1-32e4-4b5b-ba48-196916b1da6f","Type":"ContainerDied","Data":"74985d8de780639d77377cc50a2bc1ad379835a5bbc5d4214c1b5ce760672b1c"} Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.214624 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.379683 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-ssh-key\") pod \"89a829f1-32e4-4b5b-ba48-196916b1da6f\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.380063 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tlj8\" (UniqueName: \"kubernetes.io/projected/89a829f1-32e4-4b5b-ba48-196916b1da6f-kube-api-access-2tlj8\") pod \"89a829f1-32e4-4b5b-ba48-196916b1da6f\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.380162 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-inventory\") pod \"89a829f1-32e4-4b5b-ba48-196916b1da6f\" (UID: \"89a829f1-32e4-4b5b-ba48-196916b1da6f\") " Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.386437 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/89a829f1-32e4-4b5b-ba48-196916b1da6f-kube-api-access-2tlj8" (OuterVolumeSpecName: "kube-api-access-2tlj8") pod "89a829f1-32e4-4b5b-ba48-196916b1da6f" (UID: "89a829f1-32e4-4b5b-ba48-196916b1da6f"). InnerVolumeSpecName "kube-api-access-2tlj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.409350 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "89a829f1-32e4-4b5b-ba48-196916b1da6f" (UID: "89a829f1-32e4-4b5b-ba48-196916b1da6f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.416251 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-inventory" (OuterVolumeSpecName: "inventory") pod "89a829f1-32e4-4b5b-ba48-196916b1da6f" (UID: "89a829f1-32e4-4b5b-ba48-196916b1da6f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.482696 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.482788 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a829f1-32e4-4b5b-ba48-196916b1da6f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.482801 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tlj8\" (UniqueName: \"kubernetes.io/projected/89a829f1-32e4-4b5b-ba48-196916b1da6f-kube-api-access-2tlj8\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.756680 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" event={"ID":"89a829f1-32e4-4b5b-ba48-196916b1da6f","Type":"ContainerDied","Data":"4f4d64a13e9dedd6c88369f3ffd53a819bde0ce0bc9a2224f99c7f5f2aee8beb"} Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.756799 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f4d64a13e9dedd6c88369f3ffd53a819bde0ce0bc9a2224f99c7f5f2aee8beb" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.756928 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.895065 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x"] Dec 02 20:35:47 crc kubenswrapper[4807]: E1202 20:35:47.895932 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a829f1-32e4-4b5b-ba48-196916b1da6f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.895975 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a829f1-32e4-4b5b-ba48-196916b1da6f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.896409 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a829f1-32e4-4b5b-ba48-196916b1da6f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.897868 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.906963 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.907359 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.907598 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.907912 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.908885 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.910179 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.910450 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.911275 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 02 20:35:47 crc kubenswrapper[4807]: I1202 20:35:47.914388 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x"] Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.101092 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.101157 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-487xw\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-kube-api-access-487xw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.101190 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.101218 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.101495 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.101627 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.101788 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.101864 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.101922 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.102009 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.102092 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.102612 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.102648 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.102775 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.107708 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.164929 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.205544 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.205624 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-487xw\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-kube-api-access-487xw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: 
\"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.205673 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.205736 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.205809 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.205854 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.205897 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.205938 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.205977 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.206026 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.206073 
4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.206233 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.206285 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.206412 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.211816 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.211896 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.212376 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.212976 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.213049 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" 
(UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.213818 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.214288 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.214642 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.215457 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: 
I1202 20:35:48.215869 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.216274 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.217203 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.217783 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.222286 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-487xw\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-kube-api-access-487xw\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.228142 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.379045 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mxhgh"] Dec 02 20:35:48 crc kubenswrapper[4807]: I1202 20:35:48.895371 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x"] Dec 02 20:35:49 crc kubenswrapper[4807]: I1202 20:35:49.779882 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" event={"ID":"64a6a7a0-63cc-48bb-a936-21fbab3123e9","Type":"ContainerStarted","Data":"38f112403182beb8f62c168211a417dce7c9418280696f3b721cae4726a23838"} Dec 02 20:35:49 crc kubenswrapper[4807]: I1202 20:35:49.780129 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mxhgh" podUID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerName="registry-server" containerID="cri-o://a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e" gracePeriod=2 Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.247738 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.363296 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-utilities\") pod \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.363343 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgn6n\" (UniqueName: \"kubernetes.io/projected/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-kube-api-access-sgn6n\") pod \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.363403 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-catalog-content\") pod \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\" (UID: \"218ef5ca-7e1a-4db0-acc8-da97cd9d162c\") " Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.364369 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-utilities" (OuterVolumeSpecName: "utilities") pod "218ef5ca-7e1a-4db0-acc8-da97cd9d162c" (UID: "218ef5ca-7e1a-4db0-acc8-da97cd9d162c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.370179 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-kube-api-access-sgn6n" (OuterVolumeSpecName: "kube-api-access-sgn6n") pod "218ef5ca-7e1a-4db0-acc8-da97cd9d162c" (UID: "218ef5ca-7e1a-4db0-acc8-da97cd9d162c"). InnerVolumeSpecName "kube-api-access-sgn6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.465626 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.465935 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgn6n\" (UniqueName: \"kubernetes.io/projected/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-kube-api-access-sgn6n\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.480771 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "218ef5ca-7e1a-4db0-acc8-da97cd9d162c" (UID: "218ef5ca-7e1a-4db0-acc8-da97cd9d162c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.567433 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218ef5ca-7e1a-4db0-acc8-da97cd9d162c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.800441 4807 generic.go:334] "Generic (PLEG): container finished" podID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerID="a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e" exitCode=0 Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.800508 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mxhgh" event={"ID":"218ef5ca-7e1a-4db0-acc8-da97cd9d162c","Type":"ContainerDied","Data":"a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e"} Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.800573 4807 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-mxhgh" event={"ID":"218ef5ca-7e1a-4db0-acc8-da97cd9d162c","Type":"ContainerDied","Data":"170008e423ac5e4de31208e061b0cc6fac914677cd833f297f360190ae889ad0"} Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.800604 4807 scope.go:117] "RemoveContainer" containerID="a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.801912 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mxhgh" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.805266 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" event={"ID":"64a6a7a0-63cc-48bb-a936-21fbab3123e9","Type":"ContainerStarted","Data":"6acb47470f4e527e3565d0a1ec279b5f1638e270b892d3c3d72b4c5f0f556a40"} Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.836283 4807 scope.go:117] "RemoveContainer" containerID="daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.840386 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" podStartSLOduration=3.141244736 podStartE2EDuration="3.840322524s" podCreationTimestamp="2025-12-02 20:35:47 +0000 UTC" firstStartedPulling="2025-12-02 20:35:48.894478301 +0000 UTC m=+2284.195385826" lastFinishedPulling="2025-12-02 20:35:49.593556079 +0000 UTC m=+2284.894463614" observedRunningTime="2025-12-02 20:35:50.828484044 +0000 UTC m=+2286.129391609" watchObservedRunningTime="2025-12-02 20:35:50.840322524 +0000 UTC m=+2286.141230049" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.861057 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mxhgh"] Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.870540 4807 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mxhgh"] Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.875240 4807 scope.go:117] "RemoveContainer" containerID="c603b1e6a9214cad1111a196a0ad9d69c2d0d95555c6047692341fb89a8874e3" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.932510 4807 scope.go:117] "RemoveContainer" containerID="a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e" Dec 02 20:35:50 crc kubenswrapper[4807]: E1202 20:35:50.933127 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e\": container with ID starting with a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e not found: ID does not exist" containerID="a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.933197 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e"} err="failed to get container status \"a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e\": rpc error: code = NotFound desc = could not find container \"a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e\": container with ID starting with a598e238eda3ec348da6dcb99288886f23f9740047578cc58eb3a1abc38a752e not found: ID does not exist" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.933237 4807 scope.go:117] "RemoveContainer" containerID="daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3" Dec 02 20:35:50 crc kubenswrapper[4807]: E1202 20:35:50.934166 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3\": container with ID starting with 
daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3 not found: ID does not exist" containerID="daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.934226 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3"} err="failed to get container status \"daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3\": rpc error: code = NotFound desc = could not find container \"daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3\": container with ID starting with daa6892d52ca0a219b6ce4169eaf19db9cbc2b7e239f7435de48b1569524a9e3 not found: ID does not exist" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.934263 4807 scope.go:117] "RemoveContainer" containerID="c603b1e6a9214cad1111a196a0ad9d69c2d0d95555c6047692341fb89a8874e3" Dec 02 20:35:50 crc kubenswrapper[4807]: E1202 20:35:50.934788 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c603b1e6a9214cad1111a196a0ad9d69c2d0d95555c6047692341fb89a8874e3\": container with ID starting with c603b1e6a9214cad1111a196a0ad9d69c2d0d95555c6047692341fb89a8874e3 not found: ID does not exist" containerID="c603b1e6a9214cad1111a196a0ad9d69c2d0d95555c6047692341fb89a8874e3" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.934883 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c603b1e6a9214cad1111a196a0ad9d69c2d0d95555c6047692341fb89a8874e3"} err="failed to get container status \"c603b1e6a9214cad1111a196a0ad9d69c2d0d95555c6047692341fb89a8874e3\": rpc error: code = NotFound desc = could not find container \"c603b1e6a9214cad1111a196a0ad9d69c2d0d95555c6047692341fb89a8874e3\": container with ID starting with c603b1e6a9214cad1111a196a0ad9d69c2d0d95555c6047692341fb89a8874e3 not found: ID does not 
exist" Dec 02 20:35:50 crc kubenswrapper[4807]: I1202 20:35:50.993092 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" path="/var/lib/kubelet/pods/218ef5ca-7e1a-4db0-acc8-da97cd9d162c/volumes" Dec 02 20:35:58 crc kubenswrapper[4807]: I1202 20:35:58.293024 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:35:58 crc kubenswrapper[4807]: I1202 20:35:58.293603 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:35:58 crc kubenswrapper[4807]: I1202 20:35:58.293698 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:35:58 crc kubenswrapper[4807]: I1202 20:35:58.294592 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:35:58 crc kubenswrapper[4807]: I1202 20:35:58.294666 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" 
containerID="cri-o://ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" gracePeriod=600 Dec 02 20:35:58 crc kubenswrapper[4807]: E1202 20:35:58.430481 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:35:58 crc kubenswrapper[4807]: I1202 20:35:58.900373 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" exitCode=0 Dec 02 20:35:58 crc kubenswrapper[4807]: I1202 20:35:58.900455 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4"} Dec 02 20:35:58 crc kubenswrapper[4807]: I1202 20:35:58.900508 4807 scope.go:117] "RemoveContainer" containerID="083b747738a7b266cd838719e592adc712e0ead5027434ea8e6c8467bbbf14c4" Dec 02 20:35:58 crc kubenswrapper[4807]: I1202 20:35:58.901614 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:35:58 crc kubenswrapper[4807]: E1202 20:35:58.902142 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" 
podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:36:10 crc kubenswrapper[4807]: I1202 20:36:10.973052 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:36:10 crc kubenswrapper[4807]: E1202 20:36:10.974501 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:36:25 crc kubenswrapper[4807]: I1202 20:36:25.972976 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:36:25 crc kubenswrapper[4807]: E1202 20:36:25.973827 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:36:34 crc kubenswrapper[4807]: I1202 20:36:34.323987 4807 generic.go:334] "Generic (PLEG): container finished" podID="64a6a7a0-63cc-48bb-a936-21fbab3123e9" containerID="6acb47470f4e527e3565d0a1ec279b5f1638e270b892d3c3d72b4c5f0f556a40" exitCode=0 Dec 02 20:36:34 crc kubenswrapper[4807]: I1202 20:36:34.324084 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" event={"ID":"64a6a7a0-63cc-48bb-a936-21fbab3123e9","Type":"ContainerDied","Data":"6acb47470f4e527e3565d0a1ec279b5f1638e270b892d3c3d72b4c5f0f556a40"} Dec 02 
20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.885030 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.904303 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-nova-combined-ca-bundle\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.904415 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-487xw\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-kube-api-access-487xw\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.904453 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.904480 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-telemetry-combined-ca-bundle\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.904535 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.904586 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-bootstrap-combined-ca-bundle\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.904619 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-repo-setup-combined-ca-bundle\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.904661 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.904694 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ovn-combined-ca-bundle\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.904832 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-inventory\") pod 
\"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.906219 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.906298 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-libvirt-combined-ca-bundle\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.906347 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-neutron-metadata-combined-ca-bundle\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.906402 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ssh-key\") pod \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\" (UID: \"64a6a7a0-63cc-48bb-a936-21fbab3123e9\") " Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.917519 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-kube-api-access-487xw" (OuterVolumeSpecName: "kube-api-access-487xw") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). 
InnerVolumeSpecName "kube-api-access-487xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.917930 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.920081 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.920137 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.920308 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.924198 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.924533 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.926431 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.931186 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.931847 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.936375 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.936108 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). 
InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.969787 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:36:35 crc kubenswrapper[4807]: I1202 20:36:35.978504 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-inventory" (OuterVolumeSpecName: "inventory") pod "64a6a7a0-63cc-48bb-a936-21fbab3123e9" (UID: "64a6a7a0-63cc-48bb-a936-21fbab3123e9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.011890 4807 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.011936 4807 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.011954 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.011970 4807 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.011986 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-487xw\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-kube-api-access-487xw\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.012004 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.012020 4807 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.012036 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.012051 4807 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.012067 4807 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 
20:36:36.012110 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.012127 4807 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.012142 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a6a7a0-63cc-48bb-a936-21fbab3123e9-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.012158 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/64a6a7a0-63cc-48bb-a936-21fbab3123e9-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.356708 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" event={"ID":"64a6a7a0-63cc-48bb-a936-21fbab3123e9","Type":"ContainerDied","Data":"38f112403182beb8f62c168211a417dce7c9418280696f3b721cae4726a23838"} Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.356849 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38f112403182beb8f62c168211a417dce7c9418280696f3b721cae4726a23838" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.356983 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.476480 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk"] Dec 02 20:36:36 crc kubenswrapper[4807]: E1202 20:36:36.477475 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a6a7a0-63cc-48bb-a936-21fbab3123e9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.477510 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a6a7a0-63cc-48bb-a936-21fbab3123e9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 20:36:36 crc kubenswrapper[4807]: E1202 20:36:36.477541 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerName="extract-utilities" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.477555 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerName="extract-utilities" Dec 02 20:36:36 crc kubenswrapper[4807]: E1202 20:36:36.477611 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerName="registry-server" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.477625 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerName="registry-server" Dec 02 20:36:36 crc kubenswrapper[4807]: E1202 20:36:36.477659 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerName="extract-content" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.477676 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerName="extract-content" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.478082 
4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="218ef5ca-7e1a-4db0-acc8-da97cd9d162c" containerName="registry-server" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.478134 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a6a7a0-63cc-48bb-a936-21fbab3123e9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.479578 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.482888 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.482938 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.483038 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.482938 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.483386 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.491822 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk"] Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.524747 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: 
\"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.524903 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.525087 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7nvl\" (UniqueName: \"kubernetes.io/projected/8b4b56e1-9070-4a42-beff-c3d9324e820c-kube-api-access-v7nvl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.525154 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.525303 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.627294 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.627376 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7nvl\" (UniqueName: \"kubernetes.io/projected/8b4b56e1-9070-4a42-beff-c3d9324e820c-kube-api-access-v7nvl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.627404 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.627476 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.627526 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.629445 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.631649 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.631876 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.632269 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.647220 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7nvl\" (UniqueName: \"kubernetes.io/projected/8b4b56e1-9070-4a42-beff-c3d9324e820c-kube-api-access-v7nvl\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-9q5fk\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:36 crc kubenswrapper[4807]: I1202 20:36:36.809760 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:36:37 crc kubenswrapper[4807]: I1202 20:36:37.188913 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk"] Dec 02 20:36:37 crc kubenswrapper[4807]: W1202 20:36:37.195167 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b4b56e1_9070_4a42_beff_c3d9324e820c.slice/crio-2db288d4614814c0125c9a6a5fdd45087c1b0830cb18bb099678b3016d4024dd WatchSource:0}: Error finding container 2db288d4614814c0125c9a6a5fdd45087c1b0830cb18bb099678b3016d4024dd: Status 404 returned error can't find the container with id 2db288d4614814c0125c9a6a5fdd45087c1b0830cb18bb099678b3016d4024dd Dec 02 20:36:37 crc kubenswrapper[4807]: I1202 20:36:37.371601 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" event={"ID":"8b4b56e1-9070-4a42-beff-c3d9324e820c","Type":"ContainerStarted","Data":"2db288d4614814c0125c9a6a5fdd45087c1b0830cb18bb099678b3016d4024dd"} Dec 02 20:36:38 crc kubenswrapper[4807]: I1202 20:36:38.384233 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" event={"ID":"8b4b56e1-9070-4a42-beff-c3d9324e820c","Type":"ContainerStarted","Data":"c3ba85d32ad0636c6d4890b7db64af035c33546994ce7fcbf4f31c5af0848337"} Dec 02 20:36:38 crc kubenswrapper[4807]: I1202 20:36:38.414146 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" podStartSLOduration=1.853555287 
podStartE2EDuration="2.414118113s" podCreationTimestamp="2025-12-02 20:36:36 +0000 UTC" firstStartedPulling="2025-12-02 20:36:37.197658898 +0000 UTC m=+2332.498566403" lastFinishedPulling="2025-12-02 20:36:37.758221694 +0000 UTC m=+2333.059129229" observedRunningTime="2025-12-02 20:36:38.399984668 +0000 UTC m=+2333.700892163" watchObservedRunningTime="2025-12-02 20:36:38.414118113 +0000 UTC m=+2333.715025608" Dec 02 20:36:39 crc kubenswrapper[4807]: I1202 20:36:39.973815 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:36:39 crc kubenswrapper[4807]: E1202 20:36:39.974558 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:36:51 crc kubenswrapper[4807]: I1202 20:36:51.974216 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:36:51 crc kubenswrapper[4807]: E1202 20:36:51.975417 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:37:05 crc kubenswrapper[4807]: I1202 20:37:05.973263 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:37:05 crc kubenswrapper[4807]: E1202 20:37:05.977852 
4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:37:17 crc kubenswrapper[4807]: I1202 20:37:17.973041 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:37:17 crc kubenswrapper[4807]: E1202 20:37:17.974140 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:37:30 crc kubenswrapper[4807]: I1202 20:37:30.972769 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:37:30 crc kubenswrapper[4807]: E1202 20:37:30.974250 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:37:43 crc kubenswrapper[4807]: I1202 20:37:43.972770 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:37:43 crc kubenswrapper[4807]: E1202 
20:37:43.974008 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:37:54 crc kubenswrapper[4807]: I1202 20:37:54.280966 4807 generic.go:334] "Generic (PLEG): container finished" podID="8b4b56e1-9070-4a42-beff-c3d9324e820c" containerID="c3ba85d32ad0636c6d4890b7db64af035c33546994ce7fcbf4f31c5af0848337" exitCode=0 Dec 02 20:37:54 crc kubenswrapper[4807]: I1202 20:37:54.281586 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" event={"ID":"8b4b56e1-9070-4a42-beff-c3d9324e820c","Type":"ContainerDied","Data":"c3ba85d32ad0636c6d4890b7db64af035c33546994ce7fcbf4f31c5af0848337"} Dec 02 20:37:55 crc kubenswrapper[4807]: I1202 20:37:55.845262 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:37:55 crc kubenswrapper[4807]: I1202 20:37:55.977189 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ssh-key\") pod \"8b4b56e1-9070-4a42-beff-c3d9324e820c\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " Dec 02 20:37:55 crc kubenswrapper[4807]: I1202 20:37:55.977387 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovncontroller-config-0\") pod \"8b4b56e1-9070-4a42-beff-c3d9324e820c\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " Dec 02 20:37:55 crc kubenswrapper[4807]: I1202 20:37:55.977411 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-inventory\") pod \"8b4b56e1-9070-4a42-beff-c3d9324e820c\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " Dec 02 20:37:55 crc kubenswrapper[4807]: I1202 20:37:55.977458 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7nvl\" (UniqueName: \"kubernetes.io/projected/8b4b56e1-9070-4a42-beff-c3d9324e820c-kube-api-access-v7nvl\") pod \"8b4b56e1-9070-4a42-beff-c3d9324e820c\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " Dec 02 20:37:55 crc kubenswrapper[4807]: I1202 20:37:55.977566 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovn-combined-ca-bundle\") pod \"8b4b56e1-9070-4a42-beff-c3d9324e820c\" (UID: \"8b4b56e1-9070-4a42-beff-c3d9324e820c\") " Dec 02 20:37:55 crc kubenswrapper[4807]: I1202 20:37:55.998605 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8b4b56e1-9070-4a42-beff-c3d9324e820c" (UID: "8b4b56e1-9070-4a42-beff-c3d9324e820c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:37:55 crc kubenswrapper[4807]: I1202 20:37:55.998637 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b4b56e1-9070-4a42-beff-c3d9324e820c-kube-api-access-v7nvl" (OuterVolumeSpecName: "kube-api-access-v7nvl") pod "8b4b56e1-9070-4a42-beff-c3d9324e820c" (UID: "8b4b56e1-9070-4a42-beff-c3d9324e820c"). InnerVolumeSpecName "kube-api-access-v7nvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.027171 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8b4b56e1-9070-4a42-beff-c3d9324e820c" (UID: "8b4b56e1-9070-4a42-beff-c3d9324e820c"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.030563 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8b4b56e1-9070-4a42-beff-c3d9324e820c" (UID: "8b4b56e1-9070-4a42-beff-c3d9324e820c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.041214 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-inventory" (OuterVolumeSpecName: "inventory") pod "8b4b56e1-9070-4a42-beff-c3d9324e820c" (UID: "8b4b56e1-9070-4a42-beff-c3d9324e820c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.079738 4807 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.079980 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.080052 4807 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8b4b56e1-9070-4a42-beff-c3d9324e820c-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.080112 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4b56e1-9070-4a42-beff-c3d9324e820c-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.080165 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7nvl\" (UniqueName: \"kubernetes.io/projected/8b4b56e1-9070-4a42-beff-c3d9324e820c-kube-api-access-v7nvl\") on node \"crc\" DevicePath \"\"" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.315500 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" event={"ID":"8b4b56e1-9070-4a42-beff-c3d9324e820c","Type":"ContainerDied","Data":"2db288d4614814c0125c9a6a5fdd45087c1b0830cb18bb099678b3016d4024dd"} Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.315561 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2db288d4614814c0125c9a6a5fdd45087c1b0830cb18bb099678b3016d4024dd" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.315651 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9q5fk" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.478535 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x"] Dec 02 20:37:56 crc kubenswrapper[4807]: E1202 20:37:56.479285 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b4b56e1-9070-4a42-beff-c3d9324e820c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.479319 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b4b56e1-9070-4a42-beff-c3d9324e820c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.479654 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b4b56e1-9070-4a42-beff-c3d9324e820c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.480884 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.495932 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x"] Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.516973 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.517148 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.517169 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.517283 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.517480 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.518129 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.597640 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.597711 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.597939 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv84v\" (UniqueName: \"kubernetes.io/projected/f2cd5e17-9097-447f-8fcf-7e95a2621845-kube-api-access-dv84v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.598214 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.598342 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.598465 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.700626 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.700692 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.701574 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv84v\" (UniqueName: \"kubernetes.io/projected/f2cd5e17-9097-447f-8fcf-7e95a2621845-kube-api-access-dv84v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.702626 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.702747 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.703853 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.707990 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.712133 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" 
(UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.713437 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.714943 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.720290 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.724754 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv84v\" (UniqueName: \"kubernetes.io/projected/f2cd5e17-9097-447f-8fcf-7e95a2621845-kube-api-access-dv84v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:56 crc kubenswrapper[4807]: I1202 20:37:56.828420 4807 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:37:57 crc kubenswrapper[4807]: I1202 20:37:57.444439 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x"] Dec 02 20:37:57 crc kubenswrapper[4807]: I1202 20:37:57.452849 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 20:37:58 crc kubenswrapper[4807]: I1202 20:37:58.342023 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" event={"ID":"f2cd5e17-9097-447f-8fcf-7e95a2621845","Type":"ContainerStarted","Data":"5eb6797cdfabcf8618cb1c04ba4a5b609ae2d64126e970daa38cc6ab1fcec844"} Dec 02 20:37:58 crc kubenswrapper[4807]: I1202 20:37:58.342378 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" event={"ID":"f2cd5e17-9097-447f-8fcf-7e95a2621845","Type":"ContainerStarted","Data":"cb3abc6c123dc77a77b0606d46ac9bbd325933ec4b2657977aca64d63fb2b5c9"} Dec 02 20:37:58 crc kubenswrapper[4807]: I1202 20:37:58.367781 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" podStartSLOduration=1.768533199 podStartE2EDuration="2.367759373s" podCreationTimestamp="2025-12-02 20:37:56 +0000 UTC" firstStartedPulling="2025-12-02 20:37:57.452543363 +0000 UTC m=+2412.753450868" lastFinishedPulling="2025-12-02 20:37:58.051769537 +0000 UTC m=+2413.352677042" observedRunningTime="2025-12-02 20:37:58.36344725 +0000 UTC m=+2413.664354755" watchObservedRunningTime="2025-12-02 20:37:58.367759373 +0000 UTC m=+2413.668666868" Dec 02 20:37:58 crc kubenswrapper[4807]: I1202 20:37:58.973414 4807 scope.go:117] "RemoveContainer" 
containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:37:58 crc kubenswrapper[4807]: E1202 20:37:58.974180 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:38:09 crc kubenswrapper[4807]: I1202 20:38:09.972910 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:38:09 crc kubenswrapper[4807]: E1202 20:38:09.973803 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:38:20 crc kubenswrapper[4807]: I1202 20:38:20.973191 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:38:20 crc kubenswrapper[4807]: E1202 20:38:20.975476 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:38:35 crc kubenswrapper[4807]: I1202 20:38:35.973019 4807 scope.go:117] 
"RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:38:35 crc kubenswrapper[4807]: E1202 20:38:35.974172 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:38:50 crc kubenswrapper[4807]: I1202 20:38:50.973347 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:38:50 crc kubenswrapper[4807]: E1202 20:38:50.974947 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:38:55 crc kubenswrapper[4807]: I1202 20:38:55.056935 4807 generic.go:334] "Generic (PLEG): container finished" podID="f2cd5e17-9097-447f-8fcf-7e95a2621845" containerID="5eb6797cdfabcf8618cb1c04ba4a5b609ae2d64126e970daa38cc6ab1fcec844" exitCode=0 Dec 02 20:38:55 crc kubenswrapper[4807]: I1202 20:38:55.057099 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" event={"ID":"f2cd5e17-9097-447f-8fcf-7e95a2621845","Type":"ContainerDied","Data":"5eb6797cdfabcf8618cb1c04ba4a5b609ae2d64126e970daa38cc6ab1fcec844"} Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.587033 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.658960 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-metadata-combined-ca-bundle\") pod \"f2cd5e17-9097-447f-8fcf-7e95a2621845\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.659081 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-ssh-key\") pod \"f2cd5e17-9097-447f-8fcf-7e95a2621845\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.659282 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f2cd5e17-9097-447f-8fcf-7e95a2621845\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.659333 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-nova-metadata-neutron-config-0\") pod \"f2cd5e17-9097-447f-8fcf-7e95a2621845\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.659361 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv84v\" (UniqueName: \"kubernetes.io/projected/f2cd5e17-9097-447f-8fcf-7e95a2621845-kube-api-access-dv84v\") pod \"f2cd5e17-9097-447f-8fcf-7e95a2621845\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " Dec 02 
20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.659464 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-inventory\") pod \"f2cd5e17-9097-447f-8fcf-7e95a2621845\" (UID: \"f2cd5e17-9097-447f-8fcf-7e95a2621845\") " Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.667933 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f2cd5e17-9097-447f-8fcf-7e95a2621845" (UID: "f2cd5e17-9097-447f-8fcf-7e95a2621845"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.668686 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2cd5e17-9097-447f-8fcf-7e95a2621845-kube-api-access-dv84v" (OuterVolumeSpecName: "kube-api-access-dv84v") pod "f2cd5e17-9097-447f-8fcf-7e95a2621845" (UID: "f2cd5e17-9097-447f-8fcf-7e95a2621845"). InnerVolumeSpecName "kube-api-access-dv84v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.698032 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2cd5e17-9097-447f-8fcf-7e95a2621845" (UID: "f2cd5e17-9097-447f-8fcf-7e95a2621845"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.698525 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f2cd5e17-9097-447f-8fcf-7e95a2621845" (UID: "f2cd5e17-9097-447f-8fcf-7e95a2621845"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.702454 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f2cd5e17-9097-447f-8fcf-7e95a2621845" (UID: "f2cd5e17-9097-447f-8fcf-7e95a2621845"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.702471 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-inventory" (OuterVolumeSpecName: "inventory") pod "f2cd5e17-9097-447f-8fcf-7e95a2621845" (UID: "f2cd5e17-9097-447f-8fcf-7e95a2621845"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.762295 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.762333 4807 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.762345 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.762357 4807 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.762370 4807 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f2cd5e17-9097-447f-8fcf-7e95a2621845-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:38:56 crc kubenswrapper[4807]: I1202 20:38:56.762388 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv84v\" (UniqueName: \"kubernetes.io/projected/f2cd5e17-9097-447f-8fcf-7e95a2621845-kube-api-access-dv84v\") on node \"crc\" DevicePath \"\"" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.083968 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" 
event={"ID":"f2cd5e17-9097-447f-8fcf-7e95a2621845","Type":"ContainerDied","Data":"cb3abc6c123dc77a77b0606d46ac9bbd325933ec4b2657977aca64d63fb2b5c9"} Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.084010 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.084028 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb3abc6c123dc77a77b0606d46ac9bbd325933ec4b2657977aca64d63fb2b5c9" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.250706 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf"] Dec 02 20:38:57 crc kubenswrapper[4807]: E1202 20:38:57.251626 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cd5e17-9097-447f-8fcf-7e95a2621845" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.251669 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cd5e17-9097-447f-8fcf-7e95a2621845" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.252092 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2cd5e17-9097-447f-8fcf-7e95a2621845" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.253836 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.257516 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.258180 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.258551 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.259009 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.259629 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.276833 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf"] Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.379860 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.380267 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.380399 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.380436 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.380587 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4wnn\" (UniqueName: \"kubernetes.io/projected/a22ccfdd-695f-49fe-9bd9-5f1109915c63-kube-api-access-x4wnn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.482812 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.482867 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.482972 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4wnn\" (UniqueName: \"kubernetes.io/projected/a22ccfdd-695f-49fe-9bd9-5f1109915c63-kube-api-access-x4wnn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.482998 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.483030 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.488371 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.488386 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.488964 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.501573 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.515664 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4wnn\" (UniqueName: \"kubernetes.io/projected/a22ccfdd-695f-49fe-9bd9-5f1109915c63-kube-api-access-x4wnn\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:57 crc kubenswrapper[4807]: I1202 20:38:57.581924 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:38:58 crc kubenswrapper[4807]: I1202 20:38:58.283502 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf"] Dec 02 20:38:59 crc kubenswrapper[4807]: I1202 20:38:59.103650 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" event={"ID":"a22ccfdd-695f-49fe-9bd9-5f1109915c63","Type":"ContainerStarted","Data":"a639025b72fd4b0a76f289597c2de577357eafad230c08c7aa799027affe9b2b"} Dec 02 20:39:00 crc kubenswrapper[4807]: I1202 20:39:00.138854 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" event={"ID":"a22ccfdd-695f-49fe-9bd9-5f1109915c63","Type":"ContainerStarted","Data":"480f4091a7346bd9028d5d13c68abc13b8afaf15b081f6edcad906ac2bd9b985"} Dec 02 20:39:00 crc kubenswrapper[4807]: I1202 20:39:00.179704 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" podStartSLOduration=2.499853279 podStartE2EDuration="3.179673443s" podCreationTimestamp="2025-12-02 20:38:57 +0000 UTC" firstStartedPulling="2025-12-02 20:38:58.288050569 +0000 UTC m=+2473.588958064" lastFinishedPulling="2025-12-02 20:38:58.967870703 +0000 UTC m=+2474.268778228" observedRunningTime="2025-12-02 20:39:00.163655524 +0000 UTC m=+2475.464563039" watchObservedRunningTime="2025-12-02 20:39:00.179673443 +0000 UTC m=+2475.480580958" Dec 02 20:39:04 crc kubenswrapper[4807]: I1202 20:39:04.981744 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:39:04 crc kubenswrapper[4807]: E1202 20:39:04.982999 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:39:19 crc kubenswrapper[4807]: I1202 20:39:19.972550 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:39:19 crc kubenswrapper[4807]: E1202 20:39:19.973596 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:39:32 crc kubenswrapper[4807]: I1202 20:39:32.973589 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:39:32 crc kubenswrapper[4807]: E1202 20:39:32.974780 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:39:47 crc kubenswrapper[4807]: I1202 20:39:47.972537 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:39:47 crc kubenswrapper[4807]: E1202 20:39:47.973866 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:40:02 crc kubenswrapper[4807]: I1202 20:40:02.973172 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:40:02 crc kubenswrapper[4807]: E1202 20:40:02.974288 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:40:16 crc kubenswrapper[4807]: I1202 20:40:16.973575 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:40:16 crc kubenswrapper[4807]: E1202 20:40:16.975176 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:40:31 crc kubenswrapper[4807]: I1202 20:40:31.974526 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:40:31 crc kubenswrapper[4807]: E1202 20:40:31.975676 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:40:45 crc kubenswrapper[4807]: I1202 20:40:45.972567 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:40:45 crc kubenswrapper[4807]: E1202 20:40:45.973309 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:40:57 crc kubenswrapper[4807]: I1202 20:40:57.972837 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:40:57 crc kubenswrapper[4807]: E1202 20:40:57.973624 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:41:12 crc kubenswrapper[4807]: I1202 20:41:12.972146 4807 scope.go:117] "RemoveContainer" containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:41:13 crc kubenswrapper[4807]: I1202 20:41:13.803457 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"785326519a7daea2944ed656b5bd7e895cc747b5c4e1e45b2d32a47a0edc21d6"} Dec 02 20:43:17 crc kubenswrapper[4807]: I1202 20:43:17.773902 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5724x"] Dec 02 20:43:17 crc kubenswrapper[4807]: I1202 20:43:17.778516 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:17 crc kubenswrapper[4807]: I1202 20:43:17.783175 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5724x"] Dec 02 20:43:17 crc kubenswrapper[4807]: I1202 20:43:17.909814 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-catalog-content\") pod \"certified-operators-5724x\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:17 crc kubenswrapper[4807]: I1202 20:43:17.909867 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-utilities\") pod \"certified-operators-5724x\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:17 crc kubenswrapper[4807]: I1202 20:43:17.909903 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbsh2\" (UniqueName: \"kubernetes.io/projected/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-kube-api-access-tbsh2\") pod \"certified-operators-5724x\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " 
pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:18 crc kubenswrapper[4807]: I1202 20:43:18.011779 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-catalog-content\") pod \"certified-operators-5724x\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:18 crc kubenswrapper[4807]: I1202 20:43:18.011837 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-utilities\") pod \"certified-operators-5724x\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:18 crc kubenswrapper[4807]: I1202 20:43:18.011859 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbsh2\" (UniqueName: \"kubernetes.io/projected/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-kube-api-access-tbsh2\") pod \"certified-operators-5724x\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:18 crc kubenswrapper[4807]: I1202 20:43:18.012617 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-catalog-content\") pod \"certified-operators-5724x\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:18 crc kubenswrapper[4807]: I1202 20:43:18.012769 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-utilities\") pod \"certified-operators-5724x\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " 
pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:18 crc kubenswrapper[4807]: I1202 20:43:18.049114 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbsh2\" (UniqueName: \"kubernetes.io/projected/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-kube-api-access-tbsh2\") pod \"certified-operators-5724x\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:18 crc kubenswrapper[4807]: I1202 20:43:18.149809 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:18 crc kubenswrapper[4807]: I1202 20:43:18.749346 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5724x"] Dec 02 20:43:19 crc kubenswrapper[4807]: I1202 20:43:19.589444 4807 generic.go:334] "Generic (PLEG): container finished" podID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" containerID="a487afef639f7a37c18d856ad22348a13975d9485ef16a5ef29a71f67d13292a" exitCode=0 Dec 02 20:43:19 crc kubenswrapper[4807]: I1202 20:43:19.589780 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5724x" event={"ID":"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc","Type":"ContainerDied","Data":"a487afef639f7a37c18d856ad22348a13975d9485ef16a5ef29a71f67d13292a"} Dec 02 20:43:19 crc kubenswrapper[4807]: I1202 20:43:19.589820 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5724x" event={"ID":"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc","Type":"ContainerStarted","Data":"27e7de6972ea1086f83258c8caaee0e16c3b46e80aa19901b2c0e74cb6600a4c"} Dec 02 20:43:19 crc kubenswrapper[4807]: I1202 20:43:19.592196 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 20:43:20 crc kubenswrapper[4807]: I1202 20:43:20.605794 4807 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-5724x" event={"ID":"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc","Type":"ContainerStarted","Data":"449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744"} Dec 02 20:43:21 crc kubenswrapper[4807]: I1202 20:43:21.618976 4807 generic.go:334] "Generic (PLEG): container finished" podID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" containerID="449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744" exitCode=0 Dec 02 20:43:21 crc kubenswrapper[4807]: I1202 20:43:21.619073 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5724x" event={"ID":"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc","Type":"ContainerDied","Data":"449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744"} Dec 02 20:43:22 crc kubenswrapper[4807]: I1202 20:43:22.629303 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5724x" event={"ID":"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc","Type":"ContainerStarted","Data":"dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3"} Dec 02 20:43:22 crc kubenswrapper[4807]: I1202 20:43:22.668880 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5724x" podStartSLOduration=3.193771781 podStartE2EDuration="5.668761951s" podCreationTimestamp="2025-12-02 20:43:17 +0000 UTC" firstStartedPulling="2025-12-02 20:43:19.591832649 +0000 UTC m=+2734.892740174" lastFinishedPulling="2025-12-02 20:43:22.066822849 +0000 UTC m=+2737.367730344" observedRunningTime="2025-12-02 20:43:22.658698502 +0000 UTC m=+2737.959606007" watchObservedRunningTime="2025-12-02 20:43:22.668761951 +0000 UTC m=+2737.969669446" Dec 02 20:43:28 crc kubenswrapper[4807]: I1202 20:43:28.150946 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:28 crc kubenswrapper[4807]: I1202 
20:43:28.151815 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:28 crc kubenswrapper[4807]: I1202 20:43:28.201584 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:28 crc kubenswrapper[4807]: I1202 20:43:28.293116 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:43:28 crc kubenswrapper[4807]: I1202 20:43:28.293212 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:43:28 crc kubenswrapper[4807]: I1202 20:43:28.741653 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:28 crc kubenswrapper[4807]: I1202 20:43:28.799091 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5724x"] Dec 02 20:43:30 crc kubenswrapper[4807]: I1202 20:43:30.712449 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5724x" podUID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" containerName="registry-server" containerID="cri-o://dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3" gracePeriod=2 Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.206339 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.307213 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-catalog-content\") pod \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.307390 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbsh2\" (UniqueName: \"kubernetes.io/projected/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-kube-api-access-tbsh2\") pod \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.307476 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-utilities\") pod \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\" (UID: \"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc\") " Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.308622 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-utilities" (OuterVolumeSpecName: "utilities") pod "a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" (UID: "a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.314645 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-kube-api-access-tbsh2" (OuterVolumeSpecName: "kube-api-access-tbsh2") pod "a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" (UID: "a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc"). InnerVolumeSpecName "kube-api-access-tbsh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.409658 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.409693 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbsh2\" (UniqueName: \"kubernetes.io/projected/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-kube-api-access-tbsh2\") on node \"crc\" DevicePath \"\"" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.571312 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" (UID: "a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.613395 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.736238 4807 generic.go:334] "Generic (PLEG): container finished" podID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" containerID="dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3" exitCode=0 Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.736285 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5724x" event={"ID":"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc","Type":"ContainerDied","Data":"dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3"} Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.736313 4807 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-5724x" event={"ID":"a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc","Type":"ContainerDied","Data":"27e7de6972ea1086f83258c8caaee0e16c3b46e80aa19901b2c0e74cb6600a4c"} Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.736340 4807 scope.go:117] "RemoveContainer" containerID="dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.736371 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5724x" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.766573 4807 scope.go:117] "RemoveContainer" containerID="449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.792886 4807 scope.go:117] "RemoveContainer" containerID="a487afef639f7a37c18d856ad22348a13975d9485ef16a5ef29a71f67d13292a" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.796018 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5724x"] Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.804850 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5724x"] Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.847634 4807 scope.go:117] "RemoveContainer" containerID="dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3" Dec 02 20:43:31 crc kubenswrapper[4807]: E1202 20:43:31.848141 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3\": container with ID starting with dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3 not found: ID does not exist" containerID="dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 
20:43:31.848178 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3"} err="failed to get container status \"dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3\": rpc error: code = NotFound desc = could not find container \"dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3\": container with ID starting with dac039cf86185ec11289c6396160ab3b0df528310bdb7cf8b7057e4f699e34d3 not found: ID does not exist" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.848201 4807 scope.go:117] "RemoveContainer" containerID="449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744" Dec 02 20:43:31 crc kubenswrapper[4807]: E1202 20:43:31.848562 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744\": container with ID starting with 449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744 not found: ID does not exist" containerID="449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.848582 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744"} err="failed to get container status \"449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744\": rpc error: code = NotFound desc = could not find container \"449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744\": container with ID starting with 449427f3c1360e373441579987f834a26dfe6355ec0150e5557a23683c6e9744 not found: ID does not exist" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.848595 4807 scope.go:117] "RemoveContainer" containerID="a487afef639f7a37c18d856ad22348a13975d9485ef16a5ef29a71f67d13292a" Dec 02 20:43:31 crc 
kubenswrapper[4807]: E1202 20:43:31.848892 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a487afef639f7a37c18d856ad22348a13975d9485ef16a5ef29a71f67d13292a\": container with ID starting with a487afef639f7a37c18d856ad22348a13975d9485ef16a5ef29a71f67d13292a not found: ID does not exist" containerID="a487afef639f7a37c18d856ad22348a13975d9485ef16a5ef29a71f67d13292a" Dec 02 20:43:31 crc kubenswrapper[4807]: I1202 20:43:31.848912 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a487afef639f7a37c18d856ad22348a13975d9485ef16a5ef29a71f67d13292a"} err="failed to get container status \"a487afef639f7a37c18d856ad22348a13975d9485ef16a5ef29a71f67d13292a\": rpc error: code = NotFound desc = could not find container \"a487afef639f7a37c18d856ad22348a13975d9485ef16a5ef29a71f67d13292a\": container with ID starting with a487afef639f7a37c18d856ad22348a13975d9485ef16a5ef29a71f67d13292a not found: ID does not exist" Dec 02 20:43:32 crc kubenswrapper[4807]: I1202 20:43:32.984893 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" path="/var/lib/kubelet/pods/a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc/volumes" Dec 02 20:43:54 crc kubenswrapper[4807]: I1202 20:43:54.991960 4807 generic.go:334] "Generic (PLEG): container finished" podID="a22ccfdd-695f-49fe-9bd9-5f1109915c63" containerID="480f4091a7346bd9028d5d13c68abc13b8afaf15b081f6edcad906ac2bd9b985" exitCode=0 Dec 02 20:43:54 crc kubenswrapper[4807]: I1202 20:43:54.992043 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" event={"ID":"a22ccfdd-695f-49fe-9bd9-5f1109915c63","Type":"ContainerDied","Data":"480f4091a7346bd9028d5d13c68abc13b8afaf15b081f6edcad906ac2bd9b985"} Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.609507 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.709604 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-secret-0\") pod \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.709684 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4wnn\" (UniqueName: \"kubernetes.io/projected/a22ccfdd-695f-49fe-9bd9-5f1109915c63-kube-api-access-x4wnn\") pod \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.709729 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-inventory\") pod \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.709774 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-combined-ca-bundle\") pod \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.709914 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-ssh-key\") pod \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\" (UID: \"a22ccfdd-695f-49fe-9bd9-5f1109915c63\") " Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.729518 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/a22ccfdd-695f-49fe-9bd9-5f1109915c63-kube-api-access-x4wnn" (OuterVolumeSpecName: "kube-api-access-x4wnn") pod "a22ccfdd-695f-49fe-9bd9-5f1109915c63" (UID: "a22ccfdd-695f-49fe-9bd9-5f1109915c63"). InnerVolumeSpecName "kube-api-access-x4wnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.729653 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a22ccfdd-695f-49fe-9bd9-5f1109915c63" (UID: "a22ccfdd-695f-49fe-9bd9-5f1109915c63"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.739484 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a22ccfdd-695f-49fe-9bd9-5f1109915c63" (UID: "a22ccfdd-695f-49fe-9bd9-5f1109915c63"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.759884 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a22ccfdd-695f-49fe-9bd9-5f1109915c63" (UID: "a22ccfdd-695f-49fe-9bd9-5f1109915c63"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.760921 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-inventory" (OuterVolumeSpecName: "inventory") pod "a22ccfdd-695f-49fe-9bd9-5f1109915c63" (UID: "a22ccfdd-695f-49fe-9bd9-5f1109915c63"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.812196 4807 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.812252 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4wnn\" (UniqueName: \"kubernetes.io/projected/a22ccfdd-695f-49fe-9bd9-5f1109915c63-kube-api-access-x4wnn\") on node \"crc\" DevicePath \"\"" Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.812269 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.812282 4807 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:43:56 crc kubenswrapper[4807]: I1202 20:43:56.812294 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a22ccfdd-695f-49fe-9bd9-5f1109915c63-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.016554 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" event={"ID":"a22ccfdd-695f-49fe-9bd9-5f1109915c63","Type":"ContainerDied","Data":"a639025b72fd4b0a76f289597c2de577357eafad230c08c7aa799027affe9b2b"} Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.016598 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a639025b72fd4b0a76f289597c2de577357eafad230c08c7aa799027affe9b2b" Dec 02 20:43:57 
crc kubenswrapper[4807]: I1202 20:43:57.016703 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.121933 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r"] Dec 02 20:43:57 crc kubenswrapper[4807]: E1202 20:43:57.122331 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" containerName="extract-content" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.122352 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" containerName="extract-content" Dec 02 20:43:57 crc kubenswrapper[4807]: E1202 20:43:57.122381 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" containerName="registry-server" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.122388 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" containerName="registry-server" Dec 02 20:43:57 crc kubenswrapper[4807]: E1202 20:43:57.122400 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22ccfdd-695f-49fe-9bd9-5f1109915c63" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.122407 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22ccfdd-695f-49fe-9bd9-5f1109915c63" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 20:43:57 crc kubenswrapper[4807]: E1202 20:43:57.122435 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" containerName="extract-utilities" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.122441 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" 
containerName="extract-utilities" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.122629 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f6eaa6-cc72-4297-b38a-5c62a8ece6fc" containerName="registry-server" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.122644 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22ccfdd-695f-49fe-9bd9-5f1109915c63" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.123298 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.126330 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.126368 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.126585 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.127022 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.127095 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.127146 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.130051 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.138621 4807 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r"] Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.219820 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.219991 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.220102 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.220223 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.220323 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.220415 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.220647 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.220737 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sjt6\" (UniqueName: \"kubernetes.io/projected/c7cb6b66-35b2-477f-8d6a-3037a6931797-kube-api-access-9sjt6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.220796 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.323527 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.324489 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.324798 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.325047 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: 
I1202 20:43:57.325439 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.325699 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sjt6\" (UniqueName: \"kubernetes.io/projected/c7cb6b66-35b2-477f-8d6a-3037a6931797-kube-api-access-9sjt6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.325998 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.326262 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.326579 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.328080 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.333083 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.333659 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.336673 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: 
I1202 20:43:57.337975 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.340891 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.342994 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.344506 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.350808 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sjt6\" (UniqueName: \"kubernetes.io/projected/c7cb6b66-35b2-477f-8d6a-3037a6931797-kube-api-access-9sjt6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-48b5r\" (UID: 
\"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.442651 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:43:57 crc kubenswrapper[4807]: I1202 20:43:57.982026 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r"] Dec 02 20:43:57 crc kubenswrapper[4807]: W1202 20:43:57.987696 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7cb6b66_35b2_477f_8d6a_3037a6931797.slice/crio-fa032d8e828368f16c18124115d72a2a1f029fdbabd2ebb1449b76fee43dad68 WatchSource:0}: Error finding container fa032d8e828368f16c18124115d72a2a1f029fdbabd2ebb1449b76fee43dad68: Status 404 returned error can't find the container with id fa032d8e828368f16c18124115d72a2a1f029fdbabd2ebb1449b76fee43dad68 Dec 02 20:43:58 crc kubenswrapper[4807]: I1202 20:43:58.029595 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" event={"ID":"c7cb6b66-35b2-477f-8d6a-3037a6931797","Type":"ContainerStarted","Data":"fa032d8e828368f16c18124115d72a2a1f029fdbabd2ebb1449b76fee43dad68"} Dec 02 20:43:58 crc kubenswrapper[4807]: I1202 20:43:58.293703 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:43:58 crc kubenswrapper[4807]: I1202 20:43:58.293887 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:43:59 crc kubenswrapper[4807]: I1202 20:43:59.038426 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" event={"ID":"c7cb6b66-35b2-477f-8d6a-3037a6931797","Type":"ContainerStarted","Data":"b89d92036a3c5c6fdbb2d82f283ad5df424a6ae750edc5506dd266f973e0fd43"} Dec 02 20:43:59 crc kubenswrapper[4807]: I1202 20:43:59.067814 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" podStartSLOduration=1.533507118 podStartE2EDuration="2.067789022s" podCreationTimestamp="2025-12-02 20:43:57 +0000 UTC" firstStartedPulling="2025-12-02 20:43:57.990118605 +0000 UTC m=+2773.291026100" lastFinishedPulling="2025-12-02 20:43:58.524400509 +0000 UTC m=+2773.825308004" observedRunningTime="2025-12-02 20:43:59.05306045 +0000 UTC m=+2774.353967975" watchObservedRunningTime="2025-12-02 20:43:59.067789022 +0000 UTC m=+2774.368696527" Dec 02 20:44:28 crc kubenswrapper[4807]: I1202 20:44:28.292982 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:44:28 crc kubenswrapper[4807]: I1202 20:44:28.293540 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:44:28 crc kubenswrapper[4807]: I1202 20:44:28.293596 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:44:28 crc kubenswrapper[4807]: I1202 20:44:28.294363 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"785326519a7daea2944ed656b5bd7e895cc747b5c4e1e45b2d32a47a0edc21d6"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:44:28 crc kubenswrapper[4807]: I1202 20:44:28.294436 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://785326519a7daea2944ed656b5bd7e895cc747b5c4e1e45b2d32a47a0edc21d6" gracePeriod=600 Dec 02 20:44:29 crc kubenswrapper[4807]: I1202 20:44:29.354405 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="785326519a7daea2944ed656b5bd7e895cc747b5c4e1e45b2d32a47a0edc21d6" exitCode=0 Dec 02 20:44:29 crc kubenswrapper[4807]: I1202 20:44:29.354493 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"785326519a7daea2944ed656b5bd7e895cc747b5c4e1e45b2d32a47a0edc21d6"} Dec 02 20:44:29 crc kubenswrapper[4807]: I1202 20:44:29.355311 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae"} Dec 02 20:44:29 crc kubenswrapper[4807]: I1202 20:44:29.355352 4807 scope.go:117] "RemoveContainer" 
containerID="ee1eb2596fa86c5098b2eaa161f68c34a300ddc8955673b29481d22eadcb8bb4" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.154165 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8"] Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.156437 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.161847 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.162040 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.173097 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8"] Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.202417 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-config-volume\") pod \"collect-profiles-29411805-gddq8\" (UID: \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.202838 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrnx2\" (UniqueName: \"kubernetes.io/projected/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-kube-api-access-xrnx2\") pod \"collect-profiles-29411805-gddq8\" (UID: \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:00 crc 
kubenswrapper[4807]: I1202 20:45:00.202982 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-secret-volume\") pod \"collect-profiles-29411805-gddq8\" (UID: \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.305877 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-config-volume\") pod \"collect-profiles-29411805-gddq8\" (UID: \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.306010 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrnx2\" (UniqueName: \"kubernetes.io/projected/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-kube-api-access-xrnx2\") pod \"collect-profiles-29411805-gddq8\" (UID: \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.306038 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-secret-volume\") pod \"collect-profiles-29411805-gddq8\" (UID: \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.310701 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-config-volume\") pod \"collect-profiles-29411805-gddq8\" (UID: 
\"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.349552 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-secret-volume\") pod \"collect-profiles-29411805-gddq8\" (UID: \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.407276 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrnx2\" (UniqueName: \"kubernetes.io/projected/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-kube-api-access-xrnx2\") pod \"collect-profiles-29411805-gddq8\" (UID: \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:00 crc kubenswrapper[4807]: I1202 20:45:00.486436 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:01 crc kubenswrapper[4807]: I1202 20:45:01.036112 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8"] Dec 02 20:45:02 crc kubenswrapper[4807]: I1202 20:45:02.037515 4807 generic.go:334] "Generic (PLEG): container finished" podID="d7d24dcd-2cef-49d6-8770-30f4051c4cf5" containerID="8fa998dee75151db0096eb8857e1e77cde51172ec22cfd9c10953aba30b62159" exitCode=0 Dec 02 20:45:02 crc kubenswrapper[4807]: I1202 20:45:02.037590 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" event={"ID":"d7d24dcd-2cef-49d6-8770-30f4051c4cf5","Type":"ContainerDied","Data":"8fa998dee75151db0096eb8857e1e77cde51172ec22cfd9c10953aba30b62159"} Dec 02 20:45:02 crc kubenswrapper[4807]: I1202 20:45:02.037635 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" event={"ID":"d7d24dcd-2cef-49d6-8770-30f4051c4cf5","Type":"ContainerStarted","Data":"aad6b2eafc080aff9af8520efb258a2e9ae93c6047d92f62069b86be83d00728"} Dec 02 20:45:03 crc kubenswrapper[4807]: I1202 20:45:03.440100 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:03 crc kubenswrapper[4807]: I1202 20:45:03.506740 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrnx2\" (UniqueName: \"kubernetes.io/projected/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-kube-api-access-xrnx2\") pod \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\" (UID: \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " Dec 02 20:45:03 crc kubenswrapper[4807]: I1202 20:45:03.506965 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-config-volume\") pod \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\" (UID: \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " Dec 02 20:45:03 crc kubenswrapper[4807]: I1202 20:45:03.507059 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-secret-volume\") pod \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\" (UID: \"d7d24dcd-2cef-49d6-8770-30f4051c4cf5\") " Dec 02 20:45:03 crc kubenswrapper[4807]: I1202 20:45:03.507530 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-config-volume" (OuterVolumeSpecName: "config-volume") pod "d7d24dcd-2cef-49d6-8770-30f4051c4cf5" (UID: "d7d24dcd-2cef-49d6-8770-30f4051c4cf5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:45:03 crc kubenswrapper[4807]: I1202 20:45:03.512020 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d7d24dcd-2cef-49d6-8770-30f4051c4cf5" (UID: "d7d24dcd-2cef-49d6-8770-30f4051c4cf5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:45:03 crc kubenswrapper[4807]: I1202 20:45:03.512184 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-kube-api-access-xrnx2" (OuterVolumeSpecName: "kube-api-access-xrnx2") pod "d7d24dcd-2cef-49d6-8770-30f4051c4cf5" (UID: "d7d24dcd-2cef-49d6-8770-30f4051c4cf5"). InnerVolumeSpecName "kube-api-access-xrnx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:45:03 crc kubenswrapper[4807]: I1202 20:45:03.609136 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrnx2\" (UniqueName: \"kubernetes.io/projected/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-kube-api-access-xrnx2\") on node \"crc\" DevicePath \"\"" Dec 02 20:45:03 crc kubenswrapper[4807]: I1202 20:45:03.609172 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 20:45:03 crc kubenswrapper[4807]: I1202 20:45:03.609183 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d7d24dcd-2cef-49d6-8770-30f4051c4cf5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 20:45:04 crc kubenswrapper[4807]: I1202 20:45:04.060333 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" event={"ID":"d7d24dcd-2cef-49d6-8770-30f4051c4cf5","Type":"ContainerDied","Data":"aad6b2eafc080aff9af8520efb258a2e9ae93c6047d92f62069b86be83d00728"} Dec 02 20:45:04 crc kubenswrapper[4807]: I1202 20:45:04.060781 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aad6b2eafc080aff9af8520efb258a2e9ae93c6047d92f62069b86be83d00728" Dec 02 20:45:04 crc kubenswrapper[4807]: I1202 20:45:04.060686 4807 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8" Dec 02 20:45:04 crc kubenswrapper[4807]: I1202 20:45:04.530178 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6"] Dec 02 20:45:04 crc kubenswrapper[4807]: I1202 20:45:04.541395 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411760-dv4c6"] Dec 02 20:45:04 crc kubenswrapper[4807]: I1202 20:45:04.992971 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc7f08d-6984-4bab-9220-761b68fdec0d" path="/var/lib/kubelet/pods/3bc7f08d-6984-4bab-9220-761b68fdec0d/volumes" Dec 02 20:45:11 crc kubenswrapper[4807]: I1202 20:45:11.935741 4807 scope.go:117] "RemoveContainer" containerID="1d0bedb05fdcad8ad84c571471aa13395eb036822ff5a066e63600cd3240a83c" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.491205 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j42w7"] Dec 02 20:45:47 crc kubenswrapper[4807]: E1202 20:45:47.492180 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d24dcd-2cef-49d6-8770-30f4051c4cf5" containerName="collect-profiles" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.492194 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d24dcd-2cef-49d6-8770-30f4051c4cf5" containerName="collect-profiles" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.492399 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d24dcd-2cef-49d6-8770-30f4051c4cf5" containerName="collect-profiles" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.498808 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.506073 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j42w7"] Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.611107 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-catalog-content\") pod \"redhat-marketplace-j42w7\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.611188 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-utilities\") pod \"redhat-marketplace-j42w7\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.611399 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p96f4\" (UniqueName: \"kubernetes.io/projected/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-kube-api-access-p96f4\") pod \"redhat-marketplace-j42w7\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.713947 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-catalog-content\") pod \"redhat-marketplace-j42w7\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.714025 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-utilities\") pod \"redhat-marketplace-j42w7\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.714077 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p96f4\" (UniqueName: \"kubernetes.io/projected/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-kube-api-access-p96f4\") pod \"redhat-marketplace-j42w7\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.714507 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-catalog-content\") pod \"redhat-marketplace-j42w7\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.714579 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-utilities\") pod \"redhat-marketplace-j42w7\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.737217 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p96f4\" (UniqueName: \"kubernetes.io/projected/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-kube-api-access-p96f4\") pod \"redhat-marketplace-j42w7\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:47 crc kubenswrapper[4807]: I1202 20:45:47.860589 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:48 crc kubenswrapper[4807]: I1202 20:45:48.355079 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j42w7"] Dec 02 20:45:48 crc kubenswrapper[4807]: I1202 20:45:48.611623 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j42w7" event={"ID":"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b","Type":"ContainerStarted","Data":"a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5"} Dec 02 20:45:48 crc kubenswrapper[4807]: I1202 20:45:48.611681 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j42w7" event={"ID":"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b","Type":"ContainerStarted","Data":"a996823efca23c152174cf70e51b32790f7482c3344ae91065487b4f0df42f71"} Dec 02 20:45:49 crc kubenswrapper[4807]: I1202 20:45:49.628663 4807 generic.go:334] "Generic (PLEG): container finished" podID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" containerID="a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5" exitCode=0 Dec 02 20:45:49 crc kubenswrapper[4807]: I1202 20:45:49.628783 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j42w7" event={"ID":"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b","Type":"ContainerDied","Data":"a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5"} Dec 02 20:45:50 crc kubenswrapper[4807]: I1202 20:45:50.640409 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j42w7" event={"ID":"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b","Type":"ContainerStarted","Data":"c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd"} Dec 02 20:45:51 crc kubenswrapper[4807]: I1202 20:45:51.658529 4807 generic.go:334] "Generic (PLEG): container finished" podID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" 
containerID="c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd" exitCode=0 Dec 02 20:45:51 crc kubenswrapper[4807]: I1202 20:45:51.658631 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j42w7" event={"ID":"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b","Type":"ContainerDied","Data":"c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd"} Dec 02 20:45:52 crc kubenswrapper[4807]: I1202 20:45:52.671701 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j42w7" event={"ID":"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b","Type":"ContainerStarted","Data":"e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9"} Dec 02 20:45:52 crc kubenswrapper[4807]: I1202 20:45:52.699730 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j42w7" podStartSLOduration=3.157886207 podStartE2EDuration="5.699695453s" podCreationTimestamp="2025-12-02 20:45:47 +0000 UTC" firstStartedPulling="2025-12-02 20:45:49.634434406 +0000 UTC m=+2884.935341931" lastFinishedPulling="2025-12-02 20:45:52.176243642 +0000 UTC m=+2887.477151177" observedRunningTime="2025-12-02 20:45:52.690288404 +0000 UTC m=+2887.991195909" watchObservedRunningTime="2025-12-02 20:45:52.699695453 +0000 UTC m=+2888.000602948" Dec 02 20:45:57 crc kubenswrapper[4807]: I1202 20:45:57.861153 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:57 crc kubenswrapper[4807]: I1202 20:45:57.861883 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:57 crc kubenswrapper[4807]: I1202 20:45:57.949763 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:58 crc kubenswrapper[4807]: I1202 20:45:58.840490 
4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:45:58 crc kubenswrapper[4807]: I1202 20:45:58.909314 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j42w7"] Dec 02 20:46:00 crc kubenswrapper[4807]: I1202 20:46:00.787494 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j42w7" podUID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" containerName="registry-server" containerID="cri-o://e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9" gracePeriod=2 Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.300902 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.433270 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-catalog-content\") pod \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.433449 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-utilities\") pod \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.433600 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p96f4\" (UniqueName: \"kubernetes.io/projected/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-kube-api-access-p96f4\") pod \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\" (UID: \"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b\") " Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.435550 
4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-utilities" (OuterVolumeSpecName: "utilities") pod "8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" (UID: "8a6bafe1-5b05-4ab1-8d25-bd7bf081421b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.446996 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-kube-api-access-p96f4" (OuterVolumeSpecName: "kube-api-access-p96f4") pod "8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" (UID: "8a6bafe1-5b05-4ab1-8d25-bd7bf081421b"). InnerVolumeSpecName "kube-api-access-p96f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.467689 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" (UID: "8a6bafe1-5b05-4ab1-8d25-bd7bf081421b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.536077 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.536119 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.536131 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p96f4\" (UniqueName: \"kubernetes.io/projected/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b-kube-api-access-p96f4\") on node \"crc\" DevicePath \"\"" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.816994 4807 generic.go:334] "Generic (PLEG): container finished" podID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" containerID="e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9" exitCode=0 Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.817051 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j42w7" event={"ID":"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b","Type":"ContainerDied","Data":"e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9"} Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.817083 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j42w7" event={"ID":"8a6bafe1-5b05-4ab1-8d25-bd7bf081421b","Type":"ContainerDied","Data":"a996823efca23c152174cf70e51b32790f7482c3344ae91065487b4f0df42f71"} Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.817105 4807 scope.go:117] "RemoveContainer" containerID="e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 
20:46:01.817618 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j42w7" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.843002 4807 scope.go:117] "RemoveContainer" containerID="c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.875831 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j42w7"] Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.885898 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j42w7"] Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.891206 4807 scope.go:117] "RemoveContainer" containerID="a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.928741 4807 scope.go:117] "RemoveContainer" containerID="e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9" Dec 02 20:46:01 crc kubenswrapper[4807]: E1202 20:46:01.929370 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9\": container with ID starting with e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9 not found: ID does not exist" containerID="e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.929415 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9"} err="failed to get container status \"e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9\": rpc error: code = NotFound desc = could not find container \"e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9\": container with ID starting with 
e20b6f2120c2b98c30ebe4d3fa2b76b15860a4a9d96f60af9b7c4c21742fdec9 not found: ID does not exist" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.929445 4807 scope.go:117] "RemoveContainer" containerID="c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd" Dec 02 20:46:01 crc kubenswrapper[4807]: E1202 20:46:01.929871 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd\": container with ID starting with c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd not found: ID does not exist" containerID="c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.929901 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd"} err="failed to get container status \"c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd\": rpc error: code = NotFound desc = could not find container \"c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd\": container with ID starting with c5de64349fbf7e720e819ef26423d86819daa795c6e9d826c146d49929b127bd not found: ID does not exist" Dec 02 20:46:01 crc kubenswrapper[4807]: I1202 20:46:01.929919 4807 scope.go:117] "RemoveContainer" containerID="a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5" Dec 02 20:46:01 crc kubenswrapper[4807]: E1202 20:46:01.930126 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5\": container with ID starting with a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5 not found: ID does not exist" containerID="a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5" Dec 02 20:46:01 crc 
kubenswrapper[4807]: I1202 20:46:01.930156 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5"} err="failed to get container status \"a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5\": rpc error: code = NotFound desc = could not find container \"a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5\": container with ID starting with a6adea5a6ed4d97106f58fa8a56586f56e351354536e7c3aa6c0367c6f79a1d5 not found: ID does not exist" Dec 02 20:46:02 crc kubenswrapper[4807]: I1202 20:46:02.983079 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" path="/var/lib/kubelet/pods/8a6bafe1-5b05-4ab1-8d25-bd7bf081421b/volumes" Dec 02 20:46:28 crc kubenswrapper[4807]: I1202 20:46:28.292996 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:46:28 crc kubenswrapper[4807]: I1202 20:46:28.293707 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.239619 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4j9rm"] Dec 02 20:46:33 crc kubenswrapper[4807]: E1202 20:46:33.240913 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" containerName="extract-content" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 
20:46:33.240939 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" containerName="extract-content" Dec 02 20:46:33 crc kubenswrapper[4807]: E1202 20:46:33.240964 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" containerName="registry-server" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.240976 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" containerName="registry-server" Dec 02 20:46:33 crc kubenswrapper[4807]: E1202 20:46:33.241002 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" containerName="extract-utilities" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.241016 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" containerName="extract-utilities" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.241379 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a6bafe1-5b05-4ab1-8d25-bd7bf081421b" containerName="registry-server" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.243860 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.263334 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4j9rm"] Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.412523 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-catalog-content\") pod \"redhat-operators-4j9rm\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.412681 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkn5b\" (UniqueName: \"kubernetes.io/projected/7eb5b571-b90b-496e-bfa1-1ba54faebffc-kube-api-access-rkn5b\") pod \"redhat-operators-4j9rm\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.412729 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-utilities\") pod \"redhat-operators-4j9rm\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.514302 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-catalog-content\") pod \"redhat-operators-4j9rm\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.514467 4807 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rkn5b\" (UniqueName: \"kubernetes.io/projected/7eb5b571-b90b-496e-bfa1-1ba54faebffc-kube-api-access-rkn5b\") pod \"redhat-operators-4j9rm\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.514499 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-utilities\") pod \"redhat-operators-4j9rm\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.514960 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-catalog-content\") pod \"redhat-operators-4j9rm\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.515014 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-utilities\") pod \"redhat-operators-4j9rm\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.532651 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkn5b\" (UniqueName: \"kubernetes.io/projected/7eb5b571-b90b-496e-bfa1-1ba54faebffc-kube-api-access-rkn5b\") pod \"redhat-operators-4j9rm\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:33 crc kubenswrapper[4807]: I1202 20:46:33.568375 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:34 crc kubenswrapper[4807]: I1202 20:46:34.068353 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4j9rm"] Dec 02 20:46:34 crc kubenswrapper[4807]: I1202 20:46:34.403900 4807 generic.go:334] "Generic (PLEG): container finished" podID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerID="6cb3fc53227ec7a008830151d7252cc6de4a7cd4c32add9d0c4da42723d83f32" exitCode=0 Dec 02 20:46:34 crc kubenswrapper[4807]: I1202 20:46:34.404224 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j9rm" event={"ID":"7eb5b571-b90b-496e-bfa1-1ba54faebffc","Type":"ContainerDied","Data":"6cb3fc53227ec7a008830151d7252cc6de4a7cd4c32add9d0c4da42723d83f32"} Dec 02 20:46:34 crc kubenswrapper[4807]: I1202 20:46:34.404256 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j9rm" event={"ID":"7eb5b571-b90b-496e-bfa1-1ba54faebffc","Type":"ContainerStarted","Data":"e1b09fdb4b8b0db522210ac795d8f94f395c27afac7e8e6f152f300d6b73c76f"} Dec 02 20:46:35 crc kubenswrapper[4807]: I1202 20:46:35.419598 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j9rm" event={"ID":"7eb5b571-b90b-496e-bfa1-1ba54faebffc","Type":"ContainerStarted","Data":"0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c"} Dec 02 20:46:38 crc kubenswrapper[4807]: I1202 20:46:38.464000 4807 generic.go:334] "Generic (PLEG): container finished" podID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerID="0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c" exitCode=0 Dec 02 20:46:38 crc kubenswrapper[4807]: I1202 20:46:38.464096 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j9rm" 
event={"ID":"7eb5b571-b90b-496e-bfa1-1ba54faebffc","Type":"ContainerDied","Data":"0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c"} Dec 02 20:46:39 crc kubenswrapper[4807]: I1202 20:46:39.477981 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j9rm" event={"ID":"7eb5b571-b90b-496e-bfa1-1ba54faebffc","Type":"ContainerStarted","Data":"1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747"} Dec 02 20:46:39 crc kubenswrapper[4807]: I1202 20:46:39.507261 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4j9rm" podStartSLOduration=1.971817758 podStartE2EDuration="6.507234833s" podCreationTimestamp="2025-12-02 20:46:33 +0000 UTC" firstStartedPulling="2025-12-02 20:46:34.40598176 +0000 UTC m=+2929.706889255" lastFinishedPulling="2025-12-02 20:46:38.941398825 +0000 UTC m=+2934.242306330" observedRunningTime="2025-12-02 20:46:39.499632296 +0000 UTC m=+2934.800539801" watchObservedRunningTime="2025-12-02 20:46:39.507234833 +0000 UTC m=+2934.808142328" Dec 02 20:46:43 crc kubenswrapper[4807]: I1202 20:46:43.569340 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:43 crc kubenswrapper[4807]: I1202 20:46:43.570071 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:44 crc kubenswrapper[4807]: I1202 20:46:44.626896 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4j9rm" podUID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerName="registry-server" probeResult="failure" output=< Dec 02 20:46:44 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 20:46:44 crc kubenswrapper[4807]: > Dec 02 20:46:53 crc kubenswrapper[4807]: I1202 20:46:53.624625 4807 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:53 crc kubenswrapper[4807]: I1202 20:46:53.684637 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:53 crc kubenswrapper[4807]: I1202 20:46:53.862055 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4j9rm"] Dec 02 20:46:54 crc kubenswrapper[4807]: I1202 20:46:54.667954 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4j9rm" podUID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerName="registry-server" containerID="cri-o://1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747" gracePeriod=2 Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.231339 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.355651 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-utilities\") pod \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.355738 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkn5b\" (UniqueName: \"kubernetes.io/projected/7eb5b571-b90b-496e-bfa1-1ba54faebffc-kube-api-access-rkn5b\") pod \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.355811 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-catalog-content\") pod 
\"7eb5b571-b90b-496e-bfa1-1ba54faebffc\" (UID: \"7eb5b571-b90b-496e-bfa1-1ba54faebffc\") " Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.356911 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-utilities" (OuterVolumeSpecName: "utilities") pod "7eb5b571-b90b-496e-bfa1-1ba54faebffc" (UID: "7eb5b571-b90b-496e-bfa1-1ba54faebffc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.361813 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb5b571-b90b-496e-bfa1-1ba54faebffc-kube-api-access-rkn5b" (OuterVolumeSpecName: "kube-api-access-rkn5b") pod "7eb5b571-b90b-496e-bfa1-1ba54faebffc" (UID: "7eb5b571-b90b-496e-bfa1-1ba54faebffc"). InnerVolumeSpecName "kube-api-access-rkn5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.458792 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.458842 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkn5b\" (UniqueName: \"kubernetes.io/projected/7eb5b571-b90b-496e-bfa1-1ba54faebffc-kube-api-access-rkn5b\") on node \"crc\" DevicePath \"\"" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.473561 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7eb5b571-b90b-496e-bfa1-1ba54faebffc" (UID: "7eb5b571-b90b-496e-bfa1-1ba54faebffc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.561032 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b571-b90b-496e-bfa1-1ba54faebffc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.680499 4807 generic.go:334] "Generic (PLEG): container finished" podID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerID="1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747" exitCode=0 Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.680570 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j9rm" event={"ID":"7eb5b571-b90b-496e-bfa1-1ba54faebffc","Type":"ContainerDied","Data":"1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747"} Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.680613 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4j9rm" event={"ID":"7eb5b571-b90b-496e-bfa1-1ba54faebffc","Type":"ContainerDied","Data":"e1b09fdb4b8b0db522210ac795d8f94f395c27afac7e8e6f152f300d6b73c76f"} Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.680647 4807 scope.go:117] "RemoveContainer" containerID="1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.680886 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4j9rm" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.721886 4807 scope.go:117] "RemoveContainer" containerID="0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.731449 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4j9rm"] Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.753920 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4j9rm"] Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.758819 4807 scope.go:117] "RemoveContainer" containerID="6cb3fc53227ec7a008830151d7252cc6de4a7cd4c32add9d0c4da42723d83f32" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.805318 4807 scope.go:117] "RemoveContainer" containerID="1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747" Dec 02 20:46:55 crc kubenswrapper[4807]: E1202 20:46:55.805939 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747\": container with ID starting with 1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747 not found: ID does not exist" containerID="1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.805988 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747"} err="failed to get container status \"1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747\": rpc error: code = NotFound desc = could not find container \"1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747\": container with ID starting with 1b6df065af43f28d0e0378eff5e2c51505d782bb27444dda6b172dd309614747 not found: ID does 
not exist" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.806019 4807 scope.go:117] "RemoveContainer" containerID="0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c" Dec 02 20:46:55 crc kubenswrapper[4807]: E1202 20:46:55.806377 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c\": container with ID starting with 0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c not found: ID does not exist" containerID="0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.806412 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c"} err="failed to get container status \"0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c\": rpc error: code = NotFound desc = could not find container \"0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c\": container with ID starting with 0cc2df32bfb9c3fa893d7f297c4c47c663f5748b830e9456e465d79ce76c559c not found: ID does not exist" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.806440 4807 scope.go:117] "RemoveContainer" containerID="6cb3fc53227ec7a008830151d7252cc6de4a7cd4c32add9d0c4da42723d83f32" Dec 02 20:46:55 crc kubenswrapper[4807]: E1202 20:46:55.809942 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cb3fc53227ec7a008830151d7252cc6de4a7cd4c32add9d0c4da42723d83f32\": container with ID starting with 6cb3fc53227ec7a008830151d7252cc6de4a7cd4c32add9d0c4da42723d83f32 not found: ID does not exist" containerID="6cb3fc53227ec7a008830151d7252cc6de4a7cd4c32add9d0c4da42723d83f32" Dec 02 20:46:55 crc kubenswrapper[4807]: I1202 20:46:55.809996 4807 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cb3fc53227ec7a008830151d7252cc6de4a7cd4c32add9d0c4da42723d83f32"} err="failed to get container status \"6cb3fc53227ec7a008830151d7252cc6de4a7cd4c32add9d0c4da42723d83f32\": rpc error: code = NotFound desc = could not find container \"6cb3fc53227ec7a008830151d7252cc6de4a7cd4c32add9d0c4da42723d83f32\": container with ID starting with 6cb3fc53227ec7a008830151d7252cc6de4a7cd4c32add9d0c4da42723d83f32 not found: ID does not exist" Dec 02 20:46:57 crc kubenswrapper[4807]: I1202 20:46:57.002759 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" path="/var/lib/kubelet/pods/7eb5b571-b90b-496e-bfa1-1ba54faebffc/volumes" Dec 02 20:46:58 crc kubenswrapper[4807]: I1202 20:46:58.293092 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:46:58 crc kubenswrapper[4807]: I1202 20:46:58.293504 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:47:14 crc kubenswrapper[4807]: I1202 20:47:14.915399 4807 generic.go:334] "Generic (PLEG): container finished" podID="c7cb6b66-35b2-477f-8d6a-3037a6931797" containerID="b89d92036a3c5c6fdbb2d82f283ad5df424a6ae750edc5506dd266f973e0fd43" exitCode=0 Dec 02 20:47:14 crc kubenswrapper[4807]: I1202 20:47:14.915473 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" 
event={"ID":"c7cb6b66-35b2-477f-8d6a-3037a6931797","Type":"ContainerDied","Data":"b89d92036a3c5c6fdbb2d82f283ad5df424a6ae750edc5506dd266f973e0fd43"} Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.391803 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.538443 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-combined-ca-bundle\") pod \"c7cb6b66-35b2-477f-8d6a-3037a6931797\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.538556 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-extra-config-0\") pod \"c7cb6b66-35b2-477f-8d6a-3037a6931797\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.538579 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-1\") pod \"c7cb6b66-35b2-477f-8d6a-3037a6931797\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.538617 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-0\") pod \"c7cb6b66-35b2-477f-8d6a-3037a6931797\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.538637 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-9sjt6\" (UniqueName: \"kubernetes.io/projected/c7cb6b66-35b2-477f-8d6a-3037a6931797-kube-api-access-9sjt6\") pod \"c7cb6b66-35b2-477f-8d6a-3037a6931797\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.538704 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-0\") pod \"c7cb6b66-35b2-477f-8d6a-3037a6931797\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.538806 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-1\") pod \"c7cb6b66-35b2-477f-8d6a-3037a6931797\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.538839 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-ssh-key\") pod \"c7cb6b66-35b2-477f-8d6a-3037a6931797\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.538869 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-inventory\") pod \"c7cb6b66-35b2-477f-8d6a-3037a6931797\" (UID: \"c7cb6b66-35b2-477f-8d6a-3037a6931797\") " Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.546061 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7cb6b66-35b2-477f-8d6a-3037a6931797-kube-api-access-9sjt6" (OuterVolumeSpecName: "kube-api-access-9sjt6") pod "c7cb6b66-35b2-477f-8d6a-3037a6931797" (UID: 
"c7cb6b66-35b2-477f-8d6a-3037a6931797"). InnerVolumeSpecName "kube-api-access-9sjt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.548331 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c7cb6b66-35b2-477f-8d6a-3037a6931797" (UID: "c7cb6b66-35b2-477f-8d6a-3037a6931797"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.577429 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "c7cb6b66-35b2-477f-8d6a-3037a6931797" (UID: "c7cb6b66-35b2-477f-8d6a-3037a6931797"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.581233 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-inventory" (OuterVolumeSpecName: "inventory") pod "c7cb6b66-35b2-477f-8d6a-3037a6931797" (UID: "c7cb6b66-35b2-477f-8d6a-3037a6931797"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.589584 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c7cb6b66-35b2-477f-8d6a-3037a6931797" (UID: "c7cb6b66-35b2-477f-8d6a-3037a6931797"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.590756 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "c7cb6b66-35b2-477f-8d6a-3037a6931797" (UID: "c7cb6b66-35b2-477f-8d6a-3037a6931797"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.599517 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "c7cb6b66-35b2-477f-8d6a-3037a6931797" (UID: "c7cb6b66-35b2-477f-8d6a-3037a6931797"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.600033 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "c7cb6b66-35b2-477f-8d6a-3037a6931797" (UID: "c7cb6b66-35b2-477f-8d6a-3037a6931797"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.603621 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "c7cb6b66-35b2-477f-8d6a-3037a6931797" (UID: "c7cb6b66-35b2-477f-8d6a-3037a6931797"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.642148 4807 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.642242 4807 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.642270 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.642299 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.642327 4807 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.642352 4807 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.642377 4807 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 02 20:47:16 
crc kubenswrapper[4807]: I1202 20:47:16.642402 4807 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c7cb6b66-35b2-477f-8d6a-3037a6931797-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.642426 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sjt6\" (UniqueName: \"kubernetes.io/projected/c7cb6b66-35b2-477f-8d6a-3037a6931797-kube-api-access-9sjt6\") on node \"crc\" DevicePath \"\"" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.938764 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" event={"ID":"c7cb6b66-35b2-477f-8d6a-3037a6931797","Type":"ContainerDied","Data":"fa032d8e828368f16c18124115d72a2a1f029fdbabd2ebb1449b76fee43dad68"} Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.938811 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa032d8e828368f16c18124115d72a2a1f029fdbabd2ebb1449b76fee43dad68" Dec 02 20:47:16 crc kubenswrapper[4807]: I1202 20:47:16.938902 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-48b5r" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.078187 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg"] Dec 02 20:47:17 crc kubenswrapper[4807]: E1202 20:47:17.079009 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cb6b66-35b2-477f-8d6a-3037a6931797" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.079121 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cb6b66-35b2-477f-8d6a-3037a6931797" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 20:47:17 crc kubenswrapper[4807]: E1202 20:47:17.079229 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerName="extract-content" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.079312 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerName="extract-content" Dec 02 20:47:17 crc kubenswrapper[4807]: E1202 20:47:17.079426 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerName="extract-utilities" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.079504 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerName="extract-utilities" Dec 02 20:47:17 crc kubenswrapper[4807]: E1202 20:47:17.079585 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerName="registry-server" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.079672 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerName="registry-server" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.080069 4807 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb5b571-b90b-496e-bfa1-1ba54faebffc" containerName="registry-server" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.080207 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cb6b66-35b2-477f-8d6a-3037a6931797" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.081189 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.084706 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.085276 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.087226 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-gx4j9" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.087280 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.087301 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.112441 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg"] Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.253948 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.254111 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.254157 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffk96\" (UniqueName: \"kubernetes.io/projected/c5be8c89-b466-4c89-aecd-548b6d5d19ae-kube-api-access-ffk96\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.254271 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.254315 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.254538 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.255177 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.357902 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.358013 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.358157 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.358981 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.359325 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.359376 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffk96\" (UniqueName: \"kubernetes.io/projected/c5be8c89-b466-4c89-aecd-548b6d5d19ae-kube-api-access-ffk96\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.359583 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.366453 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.366972 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.367452 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.367879 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc 
kubenswrapper[4807]: I1202 20:47:17.368889 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.369376 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.400083 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffk96\" (UniqueName: \"kubernetes.io/projected/c5be8c89-b466-4c89-aecd-548b6d5d19ae-kube-api-access-ffk96\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s86vg\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:17 crc kubenswrapper[4807]: I1202 20:47:17.700566 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" Dec 02 20:47:18 crc kubenswrapper[4807]: W1202 20:47:18.407062 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5be8c89_b466_4c89_aecd_548b6d5d19ae.slice/crio-6c9519af9cf54cde3ee5e11e10e85398b6f0806b67712cf3f770f37c597aaec5 WatchSource:0}: Error finding container 6c9519af9cf54cde3ee5e11e10e85398b6f0806b67712cf3f770f37c597aaec5: Status 404 returned error can't find the container with id 6c9519af9cf54cde3ee5e11e10e85398b6f0806b67712cf3f770f37c597aaec5 Dec 02 20:47:18 crc kubenswrapper[4807]: I1202 20:47:18.419368 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg"] Dec 02 20:47:18 crc kubenswrapper[4807]: I1202 20:47:18.993679 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" event={"ID":"c5be8c89-b466-4c89-aecd-548b6d5d19ae","Type":"ContainerStarted","Data":"6c9519af9cf54cde3ee5e11e10e85398b6f0806b67712cf3f770f37c597aaec5"} Dec 02 20:47:20 crc kubenswrapper[4807]: I1202 20:47:20.005861 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" event={"ID":"c5be8c89-b466-4c89-aecd-548b6d5d19ae","Type":"ContainerStarted","Data":"b6a33500d1adacfcd2acb564873a91c19ce233c9e87702f4330d5e25132289a3"} Dec 02 20:47:20 crc kubenswrapper[4807]: I1202 20:47:20.042147 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" podStartSLOduration=2.349166947 podStartE2EDuration="3.042112279s" podCreationTimestamp="2025-12-02 20:47:17 +0000 UTC" firstStartedPulling="2025-12-02 20:47:18.411199566 +0000 UTC m=+2973.712107071" lastFinishedPulling="2025-12-02 20:47:19.104144868 +0000 UTC m=+2974.405052403" 
observedRunningTime="2025-12-02 20:47:20.033092031 +0000 UTC m=+2975.333999596" watchObservedRunningTime="2025-12-02 20:47:20.042112279 +0000 UTC m=+2975.343019814" Dec 02 20:47:28 crc kubenswrapper[4807]: I1202 20:47:28.292549 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:47:28 crc kubenswrapper[4807]: I1202 20:47:28.293157 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:47:28 crc kubenswrapper[4807]: I1202 20:47:28.293219 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:47:28 crc kubenswrapper[4807]: I1202 20:47:28.294130 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:47:28 crc kubenswrapper[4807]: I1202 20:47:28.294187 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" gracePeriod=600 Dec 02 20:47:28 crc kubenswrapper[4807]: E1202 
20:47:28.439856 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:47:29 crc kubenswrapper[4807]: I1202 20:47:29.161883 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" exitCode=0 Dec 02 20:47:29 crc kubenswrapper[4807]: I1202 20:47:29.161961 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae"} Dec 02 20:47:29 crc kubenswrapper[4807]: I1202 20:47:29.162220 4807 scope.go:117] "RemoveContainer" containerID="785326519a7daea2944ed656b5bd7e895cc747b5c4e1e45b2d32a47a0edc21d6" Dec 02 20:47:29 crc kubenswrapper[4807]: I1202 20:47:29.162907 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:47:29 crc kubenswrapper[4807]: E1202 20:47:29.163198 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:47:40 crc kubenswrapper[4807]: I1202 20:47:40.973156 4807 scope.go:117] "RemoveContainer" 
containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:47:40 crc kubenswrapper[4807]: E1202 20:47:40.974519 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:47:55 crc kubenswrapper[4807]: I1202 20:47:55.972238 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:47:55 crc kubenswrapper[4807]: E1202 20:47:55.973194 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:48:06 crc kubenswrapper[4807]: I1202 20:48:06.974667 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:48:06 crc kubenswrapper[4807]: E1202 20:48:06.975553 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:48:17 crc kubenswrapper[4807]: I1202 20:48:17.973470 4807 scope.go:117] 
"RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:48:17 crc kubenswrapper[4807]: E1202 20:48:17.974534 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:48:31 crc kubenswrapper[4807]: I1202 20:48:31.972561 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:48:31 crc kubenswrapper[4807]: E1202 20:48:31.973576 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:48:45 crc kubenswrapper[4807]: I1202 20:48:45.972442 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:48:45 crc kubenswrapper[4807]: E1202 20:48:45.973478 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:48:59 crc kubenswrapper[4807]: I1202 20:48:59.972920 
4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae"
Dec 02 20:48:59 crc kubenswrapper[4807]: E1202 20:48:59.974270 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 20:49:12 crc kubenswrapper[4807]: I1202 20:49:12.973470 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae"
Dec 02 20:49:12 crc kubenswrapper[4807]: E1202 20:49:12.974623 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 20:49:23 crc kubenswrapper[4807]: I1202 20:49:23.973273 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae"
Dec 02 20:49:23 crc kubenswrapper[4807]: E1202 20:49:23.975766 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 20:49:35 crc kubenswrapper[4807]: I1202 20:49:35.973639 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae"
Dec 02 20:49:35 crc kubenswrapper[4807]: E1202 20:49:35.974882 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 20:49:46 crc kubenswrapper[4807]: I1202 20:49:46.973576 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae"
Dec 02 20:49:46 crc kubenswrapper[4807]: E1202 20:49:46.974585 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 20:49:49 crc kubenswrapper[4807]: I1202 20:49:49.841780 4807 generic.go:334] "Generic (PLEG): container finished" podID="c5be8c89-b466-4c89-aecd-548b6d5d19ae" containerID="b6a33500d1adacfcd2acb564873a91c19ce233c9e87702f4330d5e25132289a3" exitCode=0
Dec 02 20:49:49 crc kubenswrapper[4807]: I1202 20:49:49.841823 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" event={"ID":"c5be8c89-b466-4c89-aecd-548b6d5d19ae","Type":"ContainerDied","Data":"b6a33500d1adacfcd2acb564873a91c19ce233c9e87702f4330d5e25132289a3"}
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.446069 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg"
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.495876 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffk96\" (UniqueName: \"kubernetes.io/projected/c5be8c89-b466-4c89-aecd-548b6d5d19ae-kube-api-access-ffk96\") pod \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") "
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.496028 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-inventory\") pod \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") "
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.496152 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-telemetry-combined-ca-bundle\") pod \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") "
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.496332 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ssh-key\") pod \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") "
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.496440 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-0\") pod \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") "
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.496565 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-2\") pod \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") "
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.496633 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-1\") pod \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\" (UID: \"c5be8c89-b466-4c89-aecd-548b6d5d19ae\") "
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.503492 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c5be8c89-b466-4c89-aecd-548b6d5d19ae" (UID: "c5be8c89-b466-4c89-aecd-548b6d5d19ae"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.508421 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5be8c89-b466-4c89-aecd-548b6d5d19ae-kube-api-access-ffk96" (OuterVolumeSpecName: "kube-api-access-ffk96") pod "c5be8c89-b466-4c89-aecd-548b6d5d19ae" (UID: "c5be8c89-b466-4c89-aecd-548b6d5d19ae"). InnerVolumeSpecName "kube-api-access-ffk96". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.534525 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "c5be8c89-b466-4c89-aecd-548b6d5d19ae" (UID: "c5be8c89-b466-4c89-aecd-548b6d5d19ae"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.535293 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-inventory" (OuterVolumeSpecName: "inventory") pod "c5be8c89-b466-4c89-aecd-548b6d5d19ae" (UID: "c5be8c89-b466-4c89-aecd-548b6d5d19ae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.551633 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "c5be8c89-b466-4c89-aecd-548b6d5d19ae" (UID: "c5be8c89-b466-4c89-aecd-548b6d5d19ae"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.552592 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c5be8c89-b466-4c89-aecd-548b6d5d19ae" (UID: "c5be8c89-b466-4c89-aecd-548b6d5d19ae"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.555633 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "c5be8c89-b466-4c89-aecd-548b6d5d19ae" (UID: "c5be8c89-b466-4c89-aecd-548b6d5d19ae"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.600011 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffk96\" (UniqueName: \"kubernetes.io/projected/c5be8c89-b466-4c89-aecd-548b6d5d19ae-kube-api-access-ffk96\") on node \"crc\" DevicePath \"\""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.600051 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.600063 4807 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.600072 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.600083 4807 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.600094 4807 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.600103 4807 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5be8c89-b466-4c89-aecd-548b6d5d19ae-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.865398 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg" event={"ID":"c5be8c89-b466-4c89-aecd-548b6d5d19ae","Type":"ContainerDied","Data":"6c9519af9cf54cde3ee5e11e10e85398b6f0806b67712cf3f770f37c597aaec5"}
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.865448 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c9519af9cf54cde3ee5e11e10e85398b6f0806b67712cf3f770f37c597aaec5"
Dec 02 20:49:51 crc kubenswrapper[4807]: I1202 20:49:51.865501 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s86vg"
Dec 02 20:49:57 crc kubenswrapper[4807]: I1202 20:49:57.974278 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae"
Dec 02 20:49:57 crc kubenswrapper[4807]: E1202 20:49:57.975656 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 20:50:10 crc kubenswrapper[4807]: I1202 20:50:10.973027 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae"
Dec 02 20:50:10 crc kubenswrapper[4807]: E1202 20:50:10.974104 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.677144 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bjxk8"]
Dec 02 20:50:14 crc kubenswrapper[4807]: E1202 20:50:14.678492 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5be8c89-b466-4c89-aecd-548b6d5d19ae" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.678515 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5be8c89-b466-4c89-aecd-548b6d5d19ae" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.678831 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5be8c89-b466-4c89-aecd-548b6d5d19ae" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.680954 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.702046 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bjxk8"]
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.744488 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvnw7\" (UniqueName: \"kubernetes.io/projected/10b4b764-c084-45b9-9490-dc587c8ef441-kube-api-access-zvnw7\") pod \"community-operators-bjxk8\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") " pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.744610 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-utilities\") pod \"community-operators-bjxk8\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") " pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.744896 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-catalog-content\") pod \"community-operators-bjxk8\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") " pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.845876 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvnw7\" (UniqueName: \"kubernetes.io/projected/10b4b764-c084-45b9-9490-dc587c8ef441-kube-api-access-zvnw7\") pod \"community-operators-bjxk8\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") " pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.845927 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-utilities\") pod \"community-operators-bjxk8\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") " pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.846005 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-catalog-content\") pod \"community-operators-bjxk8\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") " pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.846638 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-catalog-content\") pod \"community-operators-bjxk8\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") " pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.846656 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-utilities\") pod \"community-operators-bjxk8\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") " pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:14 crc kubenswrapper[4807]: I1202 20:50:14.892811 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvnw7\" (UniqueName: \"kubernetes.io/projected/10b4b764-c084-45b9-9490-dc587c8ef441-kube-api-access-zvnw7\") pod \"community-operators-bjxk8\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") " pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:15 crc kubenswrapper[4807]: I1202 20:50:15.070141 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:15 crc kubenswrapper[4807]: I1202 20:50:15.590643 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bjxk8"]
Dec 02 20:50:15 crc kubenswrapper[4807]: W1202 20:50:15.606817 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b4b764_c084_45b9_9490_dc587c8ef441.slice/crio-8ee95b6bc32ed0e9b4984f96826b980bb5d24a016c98c18d3784760759d39175 WatchSource:0}: Error finding container 8ee95b6bc32ed0e9b4984f96826b980bb5d24a016c98c18d3784760759d39175: Status 404 returned error can't find the container with id 8ee95b6bc32ed0e9b4984f96826b980bb5d24a016c98c18d3784760759d39175
Dec 02 20:50:16 crc kubenswrapper[4807]: I1202 20:50:16.175690 4807 generic.go:334] "Generic (PLEG): container finished" podID="10b4b764-c084-45b9-9490-dc587c8ef441" containerID="b76e07bae3a70a7fd2a4bf1027fbaa21f9c846b853564f9356888cefe0f16937" exitCode=0
Dec 02 20:50:16 crc kubenswrapper[4807]: I1202 20:50:16.175821 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjxk8" event={"ID":"10b4b764-c084-45b9-9490-dc587c8ef441","Type":"ContainerDied","Data":"b76e07bae3a70a7fd2a4bf1027fbaa21f9c846b853564f9356888cefe0f16937"}
Dec 02 20:50:16 crc kubenswrapper[4807]: I1202 20:50:16.176117 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjxk8" event={"ID":"10b4b764-c084-45b9-9490-dc587c8ef441","Type":"ContainerStarted","Data":"8ee95b6bc32ed0e9b4984f96826b980bb5d24a016c98c18d3784760759d39175"}
Dec 02 20:50:16 crc kubenswrapper[4807]: I1202 20:50:16.177903 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 20:50:18 crc kubenswrapper[4807]: I1202 20:50:18.195734 4807 generic.go:334] "Generic (PLEG): container finished" podID="10b4b764-c084-45b9-9490-dc587c8ef441" containerID="30d262bf96c6972a8898a6407171afb8d75512618429190121c385fffddd1eb1" exitCode=0
Dec 02 20:50:18 crc kubenswrapper[4807]: I1202 20:50:18.195982 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjxk8" event={"ID":"10b4b764-c084-45b9-9490-dc587c8ef441","Type":"ContainerDied","Data":"30d262bf96c6972a8898a6407171afb8d75512618429190121c385fffddd1eb1"}
Dec 02 20:50:19 crc kubenswrapper[4807]: I1202 20:50:19.213495 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjxk8" event={"ID":"10b4b764-c084-45b9-9490-dc587c8ef441","Type":"ContainerStarted","Data":"0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe"}
Dec 02 20:50:21 crc kubenswrapper[4807]: I1202 20:50:21.973069 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae"
Dec 02 20:50:21 crc kubenswrapper[4807]: E1202 20:50:21.975526 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 20:50:25 crc kubenswrapper[4807]: I1202 20:50:25.070445 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:25 crc kubenswrapper[4807]: I1202 20:50:25.070748 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:25 crc kubenswrapper[4807]: I1202 20:50:25.128742 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:25 crc kubenswrapper[4807]: I1202 20:50:25.163435 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bjxk8" podStartSLOduration=8.624571445 podStartE2EDuration="11.16340633s" podCreationTimestamp="2025-12-02 20:50:14 +0000 UTC" firstStartedPulling="2025-12-02 20:50:16.177549972 +0000 UTC m=+3151.478457457" lastFinishedPulling="2025-12-02 20:50:18.716384847 +0000 UTC m=+3154.017292342" observedRunningTime="2025-12-02 20:50:19.239445944 +0000 UTC m=+3154.540353449" watchObservedRunningTime="2025-12-02 20:50:25.16340633 +0000 UTC m=+3160.464313855"
Dec 02 20:50:25 crc kubenswrapper[4807]: I1202 20:50:25.341856 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:25 crc kubenswrapper[4807]: I1202 20:50:25.401646 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bjxk8"]
Dec 02 20:50:27 crc kubenswrapper[4807]: I1202 20:50:27.309358 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bjxk8" podUID="10b4b764-c084-45b9-9490-dc587c8ef441" containerName="registry-server" containerID="cri-o://0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe" gracePeriod=2
Dec 02 20:50:27 crc kubenswrapper[4807]: I1202 20:50:27.959509 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.058690 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvnw7\" (UniqueName: \"kubernetes.io/projected/10b4b764-c084-45b9-9490-dc587c8ef441-kube-api-access-zvnw7\") pod \"10b4b764-c084-45b9-9490-dc587c8ef441\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") "
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.058978 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-catalog-content\") pod \"10b4b764-c084-45b9-9490-dc587c8ef441\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") "
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.059222 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-utilities\") pod \"10b4b764-c084-45b9-9490-dc587c8ef441\" (UID: \"10b4b764-c084-45b9-9490-dc587c8ef441\") "
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.060160 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-utilities" (OuterVolumeSpecName: "utilities") pod "10b4b764-c084-45b9-9490-dc587c8ef441" (UID: "10b4b764-c084-45b9-9490-dc587c8ef441"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.065984 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b4b764-c084-45b9-9490-dc587c8ef441-kube-api-access-zvnw7" (OuterVolumeSpecName: "kube-api-access-zvnw7") pod "10b4b764-c084-45b9-9490-dc587c8ef441" (UID: "10b4b764-c084-45b9-9490-dc587c8ef441"). InnerVolumeSpecName "kube-api-access-zvnw7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.119826 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10b4b764-c084-45b9-9490-dc587c8ef441" (UID: "10b4b764-c084-45b9-9490-dc587c8ef441"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.162286 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.162327 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvnw7\" (UniqueName: \"kubernetes.io/projected/10b4b764-c084-45b9-9490-dc587c8ef441-kube-api-access-zvnw7\") on node \"crc\" DevicePath \"\""
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.162340 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10b4b764-c084-45b9-9490-dc587c8ef441-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.324047 4807 generic.go:334] "Generic (PLEG): container finished" podID="10b4b764-c084-45b9-9490-dc587c8ef441" containerID="0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe" exitCode=0
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.324102 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjxk8" event={"ID":"10b4b764-c084-45b9-9490-dc587c8ef441","Type":"ContainerDied","Data":"0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe"}
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.324149 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjxk8" event={"ID":"10b4b764-c084-45b9-9490-dc587c8ef441","Type":"ContainerDied","Data":"8ee95b6bc32ed0e9b4984f96826b980bb5d24a016c98c18d3784760759d39175"}
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.324172 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bjxk8"
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.324204 4807 scope.go:117] "RemoveContainer" containerID="0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe"
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.353757 4807 scope.go:117] "RemoveContainer" containerID="30d262bf96c6972a8898a6407171afb8d75512618429190121c385fffddd1eb1"
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.381125 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bjxk8"]
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.395522 4807 scope.go:117] "RemoveContainer" containerID="b76e07bae3a70a7fd2a4bf1027fbaa21f9c846b853564f9356888cefe0f16937"
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.398089 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bjxk8"]
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.449682 4807 scope.go:117] "RemoveContainer" containerID="0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe"
Dec 02 20:50:28 crc kubenswrapper[4807]: E1202 20:50:28.451135 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe\": container with ID starting with 0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe not found: ID does not exist" containerID="0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe"
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.451183 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe"} err="failed to get container status \"0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe\": rpc error: code = NotFound desc = could not find container \"0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe\": container with ID starting with 0db484237270cd6ba3247c48af1995c7d85a7e52794ef1150ccbe9dba2a64efe not found: ID does not exist"
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.451213 4807 scope.go:117] "RemoveContainer" containerID="30d262bf96c6972a8898a6407171afb8d75512618429190121c385fffddd1eb1"
Dec 02 20:50:28 crc kubenswrapper[4807]: E1202 20:50:28.451677 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30d262bf96c6972a8898a6407171afb8d75512618429190121c385fffddd1eb1\": container with ID starting with 30d262bf96c6972a8898a6407171afb8d75512618429190121c385fffddd1eb1 not found: ID does not exist" containerID="30d262bf96c6972a8898a6407171afb8d75512618429190121c385fffddd1eb1"
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.451734 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d262bf96c6972a8898a6407171afb8d75512618429190121c385fffddd1eb1"} err="failed to get container status \"30d262bf96c6972a8898a6407171afb8d75512618429190121c385fffddd1eb1\": rpc error: code = NotFound desc = could not find container \"30d262bf96c6972a8898a6407171afb8d75512618429190121c385fffddd1eb1\": container with ID starting with 30d262bf96c6972a8898a6407171afb8d75512618429190121c385fffddd1eb1 not found: ID does not exist"
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.451761 4807 scope.go:117] "RemoveContainer" containerID="b76e07bae3a70a7fd2a4bf1027fbaa21f9c846b853564f9356888cefe0f16937"
Dec 02 20:50:28 crc kubenswrapper[4807]: E1202 20:50:28.462477 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b76e07bae3a70a7fd2a4bf1027fbaa21f9c846b853564f9356888cefe0f16937\": container with ID starting with b76e07bae3a70a7fd2a4bf1027fbaa21f9c846b853564f9356888cefe0f16937 not found: ID does not exist" containerID="b76e07bae3a70a7fd2a4bf1027fbaa21f9c846b853564f9356888cefe0f16937"
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.462537 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b76e07bae3a70a7fd2a4bf1027fbaa21f9c846b853564f9356888cefe0f16937"} err="failed to get container status \"b76e07bae3a70a7fd2a4bf1027fbaa21f9c846b853564f9356888cefe0f16937\": rpc error: code = NotFound desc = could not find container \"b76e07bae3a70a7fd2a4bf1027fbaa21f9c846b853564f9356888cefe0f16937\": container with ID starting with b76e07bae3a70a7fd2a4bf1027fbaa21f9c846b853564f9356888cefe0f16937 not found: ID does not exist"
Dec 02 20:50:28 crc kubenswrapper[4807]: I1202 20:50:28.990141 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b4b764-c084-45b9-9490-dc587c8ef441" path="/var/lib/kubelet/pods/10b4b764-c084-45b9-9490-dc587c8ef441/volumes"
Dec 02 20:50:30 crc kubenswrapper[4807]: I1202 20:50:30.155946 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 02 20:50:30 crc kubenswrapper[4807]: I1202 20:50:30.156707 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="prometheus" containerID="cri-o://7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7" gracePeriod=600
Dec 02 20:50:30 crc kubenswrapper[4807]: I1202 20:50:30.156887 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="thanos-sidecar" containerID="cri-o://57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b" gracePeriod=600
Dec 02 20:50:30 crc kubenswrapper[4807]: I1202 20:50:30.156921 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="config-reloader" containerID="cri-o://a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9" gracePeriod=600
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.361490 4807 generic.go:334] "Generic (PLEG): container finished" podID="a52102ed-4584-44cb-b051-4f08f862d64d" containerID="57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b" exitCode=0
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.361873 4807 generic.go:334] "Generic (PLEG): container finished" podID="a52102ed-4584-44cb-b051-4f08f862d64d" containerID="7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7" exitCode=0
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.361616 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a52102ed-4584-44cb-b051-4f08f862d64d","Type":"ContainerDied","Data":"57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b"}
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.361962 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a52102ed-4584-44cb-b051-4f08f862d64d","Type":"ContainerDied","Data":"7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7"}
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.400559 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.142:9090/-/ready\": dial tcp 10.217.0.142:9090: connect: connection refused"
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.903939 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.943346 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"a52102ed-4584-44cb-b051-4f08f862d64d\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") "
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.943390 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config\") pod \"a52102ed-4584-44cb-b051-4f08f862d64d\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") "
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.943458 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-thanos-prometheus-http-client-file\") pod \"a52102ed-4584-44cb-b051-4f08f862d64d\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") "
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.943506 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-config\") pod \"a52102ed-4584-44cb-b051-4f08f862d64d\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") "
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.943534 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a52102ed-4584-44cb-b051-4f08f862d64d-config-out\") pod \"a52102ed-4584-44cb-b051-4f08f862d64d\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") "
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.943596 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m4vb\" (UniqueName: \"kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-kube-api-access-7m4vb\") pod \"a52102ed-4584-44cb-b051-4f08f862d64d\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") "
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.943625 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"a52102ed-4584-44cb-b051-4f08f862d64d\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") "
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.943648 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-tls-assets\") pod \"a52102ed-4584-44cb-b051-4f08f862d64d\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") "
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.943909 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"a52102ed-4584-44cb-b051-4f08f862d64d\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") "
Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.943978 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/a52102ed-4584-44cb-b051-4f08f862d64d-prometheus-metric-storage-rulefiles-0\") pod \"a52102ed-4584-44cb-b051-4f08f862d64d\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.944009 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-secret-combined-ca-bundle\") pod \"a52102ed-4584-44cb-b051-4f08f862d64d\" (UID: \"a52102ed-4584-44cb-b051-4f08f862d64d\") " Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.945122 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52102ed-4584-44cb-b051-4f08f862d64d-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "a52102ed-4584-44cb-b051-4f08f862d64d" (UID: "a52102ed-4584-44cb-b051-4f08f862d64d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.953556 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-kube-api-access-7m4vb" (OuterVolumeSpecName: "kube-api-access-7m4vb") pod "a52102ed-4584-44cb-b051-4f08f862d64d" (UID: "a52102ed-4584-44cb-b051-4f08f862d64d"). InnerVolumeSpecName "kube-api-access-7m4vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.955739 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "a52102ed-4584-44cb-b051-4f08f862d64d" (UID: "a52102ed-4584-44cb-b051-4f08f862d64d"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.956439 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a52102ed-4584-44cb-b051-4f08f862d64d" (UID: "a52102ed-4584-44cb-b051-4f08f862d64d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.957056 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "a52102ed-4584-44cb-b051-4f08f862d64d" (UID: "a52102ed-4584-44cb-b051-4f08f862d64d"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.963740 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a52102ed-4584-44cb-b051-4f08f862d64d" (UID: "a52102ed-4584-44cb-b051-4f08f862d64d"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.980996 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52102ed-4584-44cb-b051-4f08f862d64d-config-out" (OuterVolumeSpecName: "config-out") pod "a52102ed-4584-44cb-b051-4f08f862d64d" (UID: "a52102ed-4584-44cb-b051-4f08f862d64d"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.981035 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-config" (OuterVolumeSpecName: "config") pod "a52102ed-4584-44cb-b051-4f08f862d64d" (UID: "a52102ed-4584-44cb-b051-4f08f862d64d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:50:31 crc kubenswrapper[4807]: I1202 20:50:31.981107 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "a52102ed-4584-44cb-b051-4f08f862d64d" (UID: "a52102ed-4584-44cb-b051-4f08f862d64d"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.048087 4807 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a52102ed-4584-44cb-b051-4f08f862d64d-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.048115 4807 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.048128 4807 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" 
DevicePath \"\"" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.048138 4807 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.048146 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.048154 4807 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a52102ed-4584-44cb-b051-4f08f862d64d-config-out\") on node \"crc\" DevicePath \"\"" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.048162 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m4vb\" (UniqueName: \"kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-kube-api-access-7m4vb\") on node \"crc\" DevicePath \"\"" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.048171 4807 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.048179 4807 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a52102ed-4584-44cb-b051-4f08f862d64d-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.090149 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f" (OuterVolumeSpecName: 
"prometheus-metric-storage-db") pod "a52102ed-4584-44cb-b051-4f08f862d64d" (UID: "a52102ed-4584-44cb-b051-4f08f862d64d"). InnerVolumeSpecName "pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.108396 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config" (OuterVolumeSpecName: "web-config") pod "a52102ed-4584-44cb-b051-4f08f862d64d" (UID: "a52102ed-4584-44cb-b051-4f08f862d64d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.153022 4807 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a52102ed-4584-44cb-b051-4f08f862d64d-web-config\") on node \"crc\" DevicePath \"\"" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.153086 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") on node \"crc\" " Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.212805 4807 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.212971 4807 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f") on node "crc" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.255802 4807 reconciler_common.go:293] "Volume detached for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") on node \"crc\" DevicePath \"\"" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.372768 4807 generic.go:334] "Generic (PLEG): container finished" podID="a52102ed-4584-44cb-b051-4f08f862d64d" containerID="a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9" exitCode=0 Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.372854 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.372887 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a52102ed-4584-44cb-b051-4f08f862d64d","Type":"ContainerDied","Data":"a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9"} Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.373192 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a52102ed-4584-44cb-b051-4f08f862d64d","Type":"ContainerDied","Data":"4a7eeb63fd16d69630aea44b87cbfa4634a598770bfb58af595b27d41e9a0e2f"} Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.373220 4807 scope.go:117] "RemoveContainer" containerID="57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.391886 4807 scope.go:117] "RemoveContainer" 
containerID="a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.413253 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.416982 4807 scope.go:117] "RemoveContainer" containerID="7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.421826 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.445149 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:50:32 crc kubenswrapper[4807]: E1202 20:50:32.445603 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b4b764-c084-45b9-9490-dc587c8ef441" containerName="extract-utilities" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.445622 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b4b764-c084-45b9-9490-dc587c8ef441" containerName="extract-utilities" Dec 02 20:50:32 crc kubenswrapper[4807]: E1202 20:50:32.445640 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b4b764-c084-45b9-9490-dc587c8ef441" containerName="extract-content" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.445647 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b4b764-c084-45b9-9490-dc587c8ef441" containerName="extract-content" Dec 02 20:50:32 crc kubenswrapper[4807]: E1202 20:50:32.445670 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="thanos-sidecar" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.445677 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="thanos-sidecar" Dec 02 20:50:32 crc kubenswrapper[4807]: E1202 20:50:32.445684 
4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="config-reloader" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.445690 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="config-reloader" Dec 02 20:50:32 crc kubenswrapper[4807]: E1202 20:50:32.445706 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b4b764-c084-45b9-9490-dc587c8ef441" containerName="registry-server" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.445711 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b4b764-c084-45b9-9490-dc587c8ef441" containerName="registry-server" Dec 02 20:50:32 crc kubenswrapper[4807]: E1202 20:50:32.445733 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="prometheus" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.445740 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="prometheus" Dec 02 20:50:32 crc kubenswrapper[4807]: E1202 20:50:32.445754 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="init-config-reloader" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.445763 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="init-config-reloader" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.445940 4807 scope.go:117] "RemoveContainer" containerID="9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.445951 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="thanos-sidecar" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.446081 4807 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="10b4b764-c084-45b9-9490-dc587c8ef441" containerName="registry-server" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.446120 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="config-reloader" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.446164 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" containerName="prometheus" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.448139 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.451045 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.456397 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.456513 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.456836 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-nlqxt" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.456866 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.486354 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.495356 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 02 20:50:32 crc 
kubenswrapper[4807]: I1202 20:50:32.508825 4807 scope.go:117] "RemoveContainer" containerID="57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b" Dec 02 20:50:32 crc kubenswrapper[4807]: E1202 20:50:32.527190 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b\": container with ID starting with 57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b not found: ID does not exist" containerID="57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.527242 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b"} err="failed to get container status \"57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b\": rpc error: code = NotFound desc = could not find container \"57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b\": container with ID starting with 57df907f8d5ec55e94958ff44379306e20d406fcfae987fd7aa5e8b49ccc8e5b not found: ID does not exist" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.527273 4807 scope.go:117] "RemoveContainer" containerID="a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9" Dec 02 20:50:32 crc kubenswrapper[4807]: E1202 20:50:32.527786 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9\": container with ID starting with a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9 not found: ID does not exist" containerID="a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.527842 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9"} err="failed to get container status \"a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9\": rpc error: code = NotFound desc = could not find container \"a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9\": container with ID starting with a363a8c00fb72da0b549f2f69fa6ae74750dbced890a1d67e834fb32a98f2ad9 not found: ID does not exist" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.527867 4807 scope.go:117] "RemoveContainer" containerID="7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7" Dec 02 20:50:32 crc kubenswrapper[4807]: E1202 20:50:32.528163 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7\": container with ID starting with 7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7 not found: ID does not exist" containerID="7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.528183 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7"} err="failed to get container status \"7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7\": rpc error: code = NotFound desc = could not find container \"7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7\": container with ID starting with 7327b5e7d54721e1df6f00230469ab9101807546a999fae341c6b722873e27b7 not found: ID does not exist" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.528198 4807 scope.go:117] "RemoveContainer" containerID="9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e" Dec 02 20:50:32 crc kubenswrapper[4807]: E1202 20:50:32.528409 4807 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e\": container with ID starting with 9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e not found: ID does not exist" containerID="9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.528425 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e"} err="failed to get container status \"9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e\": rpc error: code = NotFound desc = could not find container \"9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e\": container with ID starting with 9a669c7afde69cecd7e6b175cf737aff63fc1a82bbb1776856e4b6a7f335343e not found: ID does not exist" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.607139 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.607192 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.607221 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.607243 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.607366 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-config\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.607568 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.607629 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " 
pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.607664 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k79b9\" (UniqueName: \"kubernetes.io/projected/35ee0b99-6360-4b6d-bc80-8b420b1054c0-kube-api-access-k79b9\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.607698 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/35ee0b99-6360-4b6d-bc80-8b420b1054c0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.607794 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35ee0b99-6360-4b6d-bc80-8b420b1054c0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.608003 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35ee0b99-6360-4b6d-bc80-8b420b1054c0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.710133 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.710216 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.710262 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/35ee0b99-6360-4b6d-bc80-8b420b1054c0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.710296 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k79b9\" (UniqueName: \"kubernetes.io/projected/35ee0b99-6360-4b6d-bc80-8b420b1054c0-kube-api-access-k79b9\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.710362 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35ee0b99-6360-4b6d-bc80-8b420b1054c0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.710424 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35ee0b99-6360-4b6d-bc80-8b420b1054c0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.710561 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.710601 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.710650 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.710694 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc 
kubenswrapper[4807]: I1202 20:50:32.710778 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-config\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.711817 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/35ee0b99-6360-4b6d-bc80-8b420b1054c0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.717695 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/35ee0b99-6360-4b6d-bc80-8b420b1054c0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.718024 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-config\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.719680 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc 
kubenswrapper[4807]: I1202 20:50:32.719895 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.720459 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.721062 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.722148 4807 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.722184 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8dd81587a0a3f5d67a5d533af4320b55477e158168be00efde5dc29e79f819c0/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.724941 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/35ee0b99-6360-4b6d-bc80-8b420b1054c0-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.726841 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/35ee0b99-6360-4b6d-bc80-8b420b1054c0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.731991 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k79b9\" (UniqueName: \"kubernetes.io/projected/35ee0b99-6360-4b6d-bc80-8b420b1054c0-kube-api-access-k79b9\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.775543 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2c20eeba-2e9f-4506-9444-ba757e530c0f\") pod \"prometheus-metric-storage-0\" (UID: \"35ee0b99-6360-4b6d-bc80-8b420b1054c0\") " pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:32 crc kubenswrapper[4807]: I1202 20:50:32.993129 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52102ed-4584-44cb-b051-4f08f862d64d" path="/var/lib/kubelet/pods/a52102ed-4584-44cb-b051-4f08f862d64d/volumes" Dec 02 20:50:33 crc kubenswrapper[4807]: I1202 20:50:33.070108 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 02 20:50:33 crc kubenswrapper[4807]: I1202 20:50:33.599596 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 02 20:50:33 crc kubenswrapper[4807]: I1202 20:50:33.973738 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:50:33 crc kubenswrapper[4807]: E1202 20:50:33.974310 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:50:34 crc kubenswrapper[4807]: I1202 20:50:34.401080 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"35ee0b99-6360-4b6d-bc80-8b420b1054c0","Type":"ContainerStarted","Data":"e14c8fa9bb9a385b53455e975760a5e7ecb9444e91359917ff54f0846146bbfe"} Dec 02 20:50:38 crc kubenswrapper[4807]: I1202 20:50:38.450675 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"35ee0b99-6360-4b6d-bc80-8b420b1054c0","Type":"ContainerStarted","Data":"2ec245d75ccd357022a7f2f7b69360c9dd14df71b5be0d95338aa013f502b623"} Dec 02 20:50:47 crc kubenswrapper[4807]: I1202 20:50:47.564977 4807 generic.go:334] "Generic (PLEG): container finished" podID="35ee0b99-6360-4b6d-bc80-8b420b1054c0" containerID="2ec245d75ccd357022a7f2f7b69360c9dd14df71b5be0d95338aa013f502b623" exitCode=0 Dec 02 20:50:47 crc kubenswrapper[4807]: I1202 20:50:47.565131 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"35ee0b99-6360-4b6d-bc80-8b420b1054c0","Type":"ContainerDied","Data":"2ec245d75ccd357022a7f2f7b69360c9dd14df71b5be0d95338aa013f502b623"} Dec 02 20:50:47 crc kubenswrapper[4807]: I1202 20:50:47.972417 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:50:47 crc kubenswrapper[4807]: E1202 20:50:47.972986 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:50:48 crc kubenswrapper[4807]: I1202 20:50:48.583398 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"35ee0b99-6360-4b6d-bc80-8b420b1054c0","Type":"ContainerStarted","Data":"b3c0533b25615920f3f3f25759de6808449e786b1b8c06abacc080cb80469b6b"} Dec 02 20:50:51 crc kubenswrapper[4807]: I1202 20:50:51.612244 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"35ee0b99-6360-4b6d-bc80-8b420b1054c0","Type":"ContainerStarted","Data":"4d670a1c2e584b099413a81fca3b0c6aafd85635830da3bd7fd1c1831ebc1c38"} Dec 02 20:50:51 crc kubenswrapper[4807]: I1202 20:50:51.612846 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"35ee0b99-6360-4b6d-bc80-8b420b1054c0","Type":"ContainerStarted","Data":"31a79ec8f7710e95d9f2299a7316ef89b679cfa6b2a3e150cdbeb00d4774791e"} Dec 02 20:50:51 crc kubenswrapper[4807]: I1202 20:50:51.655630 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.655602643 podStartE2EDuration="19.655602643s" podCreationTimestamp="2025-12-02 20:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 20:50:51.641314606 +0000 UTC m=+3186.942222101" watchObservedRunningTime="2025-12-02 20:50:51.655602643 +0000 UTC m=+3186.956510148" Dec 02 20:50:53 crc kubenswrapper[4807]: I1202 20:50:53.070965 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 02 20:51:00 crc kubenswrapper[4807]: I1202 20:51:00.972412 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:51:00 crc kubenswrapper[4807]: E1202 20:51:00.973259 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:51:03 crc kubenswrapper[4807]: I1202 20:51:03.070889 4807 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 02 20:51:03 crc kubenswrapper[4807]: I1202 20:51:03.076354 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 02 20:51:03 crc kubenswrapper[4807]: I1202 20:51:03.757099 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 02 20:51:12 crc kubenswrapper[4807]: I1202 20:51:12.972952 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:51:12 crc kubenswrapper[4807]: E1202 20:51:12.973898 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.252162 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.254395 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.256559 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6wh86" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.256562 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.257674 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.259707 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.263692 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.310469 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.310546 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.310588 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.310630 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.310883 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqxx5\" (UniqueName: \"kubernetes.io/projected/e6826607-5100-439e-b82d-224b312a6faa-kube-api-access-tqxx5\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.310955 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-config-data\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.311126 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.311171 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.311211 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.413194 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.413282 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.413330 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.413376 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.413492 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqxx5\" (UniqueName: \"kubernetes.io/projected/e6826607-5100-439e-b82d-224b312a6faa-kube-api-access-tqxx5\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.413534 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-config-data\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.413635 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.413672 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.413738 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.414885 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.415198 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.415709 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.416198 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-config-data\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.416910 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") device mount path \"/mnt/openstack/pv01\"" 
pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.422437 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.422982 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.436738 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.444745 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqxx5\" (UniqueName: \"kubernetes.io/projected/e6826607-5100-439e-b82d-224b312a6faa-kube-api-access-tqxx5\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.479397 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.593031 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.848909 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 20:51:18 crc kubenswrapper[4807]: W1202 20:51:18.850561 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6826607_5100_439e_b82d_224b312a6faa.slice/crio-f4e6fb4235efc99236d6accf9b3a178915df5202af061ae369943751065a305d WatchSource:0}: Error finding container f4e6fb4235efc99236d6accf9b3a178915df5202af061ae369943751065a305d: Status 404 returned error can't find the container with id f4e6fb4235efc99236d6accf9b3a178915df5202af061ae369943751065a305d Dec 02 20:51:18 crc kubenswrapper[4807]: I1202 20:51:18.932980 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e6826607-5100-439e-b82d-224b312a6faa","Type":"ContainerStarted","Data":"f4e6fb4235efc99236d6accf9b3a178915df5202af061ae369943751065a305d"} Dec 02 20:51:24 crc kubenswrapper[4807]: I1202 20:51:24.982690 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:51:24 crc kubenswrapper[4807]: E1202 20:51:24.983206 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:51:32 crc kubenswrapper[4807]: I1202 20:51:32.134132 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" 
event={"ID":"e6826607-5100-439e-b82d-224b312a6faa","Type":"ContainerStarted","Data":"1737b084f5f6e04b118288204293c483cd4d470b8df535f2e80004e54efd3ea1"} Dec 02 20:51:32 crc kubenswrapper[4807]: I1202 20:51:32.160590 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.627057702 podStartE2EDuration="15.160561097s" podCreationTimestamp="2025-12-02 20:51:17 +0000 UTC" firstStartedPulling="2025-12-02 20:51:18.853758009 +0000 UTC m=+3214.154665514" lastFinishedPulling="2025-12-02 20:51:30.387261404 +0000 UTC m=+3225.688168909" observedRunningTime="2025-12-02 20:51:32.156130771 +0000 UTC m=+3227.457038306" watchObservedRunningTime="2025-12-02 20:51:32.160561097 +0000 UTC m=+3227.461468632" Dec 02 20:51:36 crc kubenswrapper[4807]: I1202 20:51:36.973828 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:51:36 crc kubenswrapper[4807]: E1202 20:51:36.975031 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:51:49 crc kubenswrapper[4807]: I1202 20:51:49.973109 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:51:49 crc kubenswrapper[4807]: E1202 20:51:49.974008 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:52:00 crc kubenswrapper[4807]: I1202 20:52:00.974665 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:52:00 crc kubenswrapper[4807]: E1202 20:52:00.975695 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:52:12 crc kubenswrapper[4807]: I1202 20:52:12.972169 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:52:12 crc kubenswrapper[4807]: E1202 20:52:12.972911 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:52:26 crc kubenswrapper[4807]: I1202 20:52:26.973065 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:52:26 crc kubenswrapper[4807]: E1202 20:52:26.974270 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:52:39 crc kubenswrapper[4807]: I1202 20:52:39.973234 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:52:40 crc kubenswrapper[4807]: I1202 20:52:40.423276 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"1dfd2daec5bfc89c1478b17450a777538e2101d84f5e4b5fbfbaa5205355f2b0"} Dec 02 20:54:58 crc kubenswrapper[4807]: I1202 20:54:58.292862 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:54:58 crc kubenswrapper[4807]: I1202 20:54:58.293707 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:55:28 crc kubenswrapper[4807]: I1202 20:55:28.293877 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:55:28 crc kubenswrapper[4807]: I1202 20:55:28.294781 4807 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:55:58 crc kubenswrapper[4807]: I1202 20:55:58.293254 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:55:58 crc kubenswrapper[4807]: I1202 20:55:58.293726 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:55:58 crc kubenswrapper[4807]: I1202 20:55:58.293778 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:55:58 crc kubenswrapper[4807]: I1202 20:55:58.294576 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1dfd2daec5bfc89c1478b17450a777538e2101d84f5e4b5fbfbaa5205355f2b0"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:55:58 crc kubenswrapper[4807]: I1202 20:55:58.294622 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" 
containerID="cri-o://1dfd2daec5bfc89c1478b17450a777538e2101d84f5e4b5fbfbaa5205355f2b0" gracePeriod=600 Dec 02 20:55:58 crc kubenswrapper[4807]: I1202 20:55:58.816988 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="1dfd2daec5bfc89c1478b17450a777538e2101d84f5e4b5fbfbaa5205355f2b0" exitCode=0 Dec 02 20:55:58 crc kubenswrapper[4807]: I1202 20:55:58.817071 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"1dfd2daec5bfc89c1478b17450a777538e2101d84f5e4b5fbfbaa5205355f2b0"} Dec 02 20:55:58 crc kubenswrapper[4807]: I1202 20:55:58.817578 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0"} Dec 02 20:55:58 crc kubenswrapper[4807]: I1202 20:55:58.817601 4807 scope.go:117] "RemoveContainer" containerID="f4cda41ad87a537587163ee90906ca9b1d6ae89e2e319625afd66bf7141c74ae" Dec 02 20:57:18 crc kubenswrapper[4807]: I1202 20:57:18.816964 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cr7xz"] Dec 02 20:57:18 crc kubenswrapper[4807]: I1202 20:57:18.820773 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:18 crc kubenswrapper[4807]: I1202 20:57:18.848399 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cr7xz"] Dec 02 20:57:18 crc kubenswrapper[4807]: I1202 20:57:18.946922 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-catalog-content\") pod \"redhat-operators-cr7xz\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") " pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:18 crc kubenswrapper[4807]: I1202 20:57:18.946979 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-utilities\") pod \"redhat-operators-cr7xz\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") " pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:18 crc kubenswrapper[4807]: I1202 20:57:18.947083 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mjgw\" (UniqueName: \"kubernetes.io/projected/7e62aa90-5be8-469a-b984-6adcb92e4a91-kube-api-access-9mjgw\") pod \"redhat-operators-cr7xz\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") " pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.049348 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-catalog-content\") pod \"redhat-operators-cr7xz\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") " pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.049419 4807 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-utilities\") pod \"redhat-operators-cr7xz\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") " pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.049501 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mjgw\" (UniqueName: \"kubernetes.io/projected/7e62aa90-5be8-469a-b984-6adcb92e4a91-kube-api-access-9mjgw\") pod \"redhat-operators-cr7xz\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") " pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.049973 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-catalog-content\") pod \"redhat-operators-cr7xz\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") " pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.050493 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-utilities\") pod \"redhat-operators-cr7xz\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") " pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.086757 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mjgw\" (UniqueName: \"kubernetes.io/projected/7e62aa90-5be8-469a-b984-6adcb92e4a91-kube-api-access-9mjgw\") pod \"redhat-operators-cr7xz\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") " pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.152208 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.608643 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-94g68"] Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.611767 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.618142 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94g68"] Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.719382 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cr7xz"] Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.762616 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-catalog-content\") pod \"certified-operators-94g68\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.762687 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-utilities\") pod \"certified-operators-94g68\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.762828 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjck8\" (UniqueName: \"kubernetes.io/projected/162650fa-0e30-49ff-8cf9-fdee15219f64-kube-api-access-tjck8\") pod \"certified-operators-94g68\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " 
pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.864527 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-catalog-content\") pod \"certified-operators-94g68\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.864959 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-utilities\") pod \"certified-operators-94g68\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.865098 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjck8\" (UniqueName: \"kubernetes.io/projected/162650fa-0e30-49ff-8cf9-fdee15219f64-kube-api-access-tjck8\") pod \"certified-operators-94g68\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.865397 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-catalog-content\") pod \"certified-operators-94g68\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.865628 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-utilities\") pod \"certified-operators-94g68\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " 
pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.888155 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjck8\" (UniqueName: \"kubernetes.io/projected/162650fa-0e30-49ff-8cf9-fdee15219f64-kube-api-access-tjck8\") pod \"certified-operators-94g68\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:19 crc kubenswrapper[4807]: I1202 20:57:19.938013 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:20 crc kubenswrapper[4807]: I1202 20:57:20.534569 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94g68"] Dec 02 20:57:20 crc kubenswrapper[4807]: W1202 20:57:20.553456 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod162650fa_0e30_49ff_8cf9_fdee15219f64.slice/crio-3661f725e6f4ee01d3d98efdbc76bd40c7e5c4b5bc7025515c26b0d3664fa733 WatchSource:0}: Error finding container 3661f725e6f4ee01d3d98efdbc76bd40c7e5c4b5bc7025515c26b0d3664fa733: Status 404 returned error can't find the container with id 3661f725e6f4ee01d3d98efdbc76bd40c7e5c4b5bc7025515c26b0d3664fa733 Dec 02 20:57:20 crc kubenswrapper[4807]: I1202 20:57:20.720471 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94g68" event={"ID":"162650fa-0e30-49ff-8cf9-fdee15219f64","Type":"ContainerStarted","Data":"3661f725e6f4ee01d3d98efdbc76bd40c7e5c4b5bc7025515c26b0d3664fa733"} Dec 02 20:57:20 crc kubenswrapper[4807]: I1202 20:57:20.722609 4807 generic.go:334] "Generic (PLEG): container finished" podID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerID="5ac98f55d5351b27e1e1cddf0e68dac26fcc4f008ab4d21dc18134f06a4eb6fc" exitCode=0 Dec 02 20:57:20 crc kubenswrapper[4807]: I1202 
20:57:20.722656 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr7xz" event={"ID":"7e62aa90-5be8-469a-b984-6adcb92e4a91","Type":"ContainerDied","Data":"5ac98f55d5351b27e1e1cddf0e68dac26fcc4f008ab4d21dc18134f06a4eb6fc"} Dec 02 20:57:20 crc kubenswrapper[4807]: I1202 20:57:20.722684 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr7xz" event={"ID":"7e62aa90-5be8-469a-b984-6adcb92e4a91","Type":"ContainerStarted","Data":"14d745c45a684d5b6dd881b66faf9365c2d28026ec280483c0b7de7ed681cebc"} Dec 02 20:57:20 crc kubenswrapper[4807]: I1202 20:57:20.724798 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 20:57:21 crc kubenswrapper[4807]: I1202 20:57:21.735266 4807 generic.go:334] "Generic (PLEG): container finished" podID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerID="7a0a3cea37b236ea5f8e49a99bcb7faed146129b751e1bb6cf3290ca9ef3d747" exitCode=0 Dec 02 20:57:21 crc kubenswrapper[4807]: I1202 20:57:21.735377 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94g68" event={"ID":"162650fa-0e30-49ff-8cf9-fdee15219f64","Type":"ContainerDied","Data":"7a0a3cea37b236ea5f8e49a99bcb7faed146129b751e1bb6cf3290ca9ef3d747"} Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.409395 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wbmxp"] Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.412355 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.425055 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbmxp"] Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.518773 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-utilities\") pod \"redhat-marketplace-wbmxp\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.519110 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-catalog-content\") pod \"redhat-marketplace-wbmxp\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.519164 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8h6p\" (UniqueName: \"kubernetes.io/projected/dbc1866a-3ae4-4965-9794-32c61e384262-kube-api-access-m8h6p\") pod \"redhat-marketplace-wbmxp\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.621348 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-catalog-content\") pod \"redhat-marketplace-wbmxp\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.621479 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m8h6p\" (UniqueName: \"kubernetes.io/projected/dbc1866a-3ae4-4965-9794-32c61e384262-kube-api-access-m8h6p\") pod \"redhat-marketplace-wbmxp\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.621643 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-utilities\") pod \"redhat-marketplace-wbmxp\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.622047 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-catalog-content\") pod \"redhat-marketplace-wbmxp\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.622193 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-utilities\") pod \"redhat-marketplace-wbmxp\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.660775 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8h6p\" (UniqueName: \"kubernetes.io/projected/dbc1866a-3ae4-4965-9794-32c61e384262-kube-api-access-m8h6p\") pod \"redhat-marketplace-wbmxp\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.738653 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:22 crc kubenswrapper[4807]: I1202 20:57:22.751478 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94g68" event={"ID":"162650fa-0e30-49ff-8cf9-fdee15219f64","Type":"ContainerStarted","Data":"6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9"} Dec 02 20:57:23 crc kubenswrapper[4807]: I1202 20:57:23.247643 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbmxp"] Dec 02 20:57:23 crc kubenswrapper[4807]: I1202 20:57:23.767856 4807 generic.go:334] "Generic (PLEG): container finished" podID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerID="6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9" exitCode=0 Dec 02 20:57:23 crc kubenswrapper[4807]: I1202 20:57:23.767940 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94g68" event={"ID":"162650fa-0e30-49ff-8cf9-fdee15219f64","Type":"ContainerDied","Data":"6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9"} Dec 02 20:57:23 crc kubenswrapper[4807]: I1202 20:57:23.769902 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbmxp" event={"ID":"dbc1866a-3ae4-4965-9794-32c61e384262","Type":"ContainerStarted","Data":"d3b0213367e73f7a21e7f9d89f7d435579921feee5a0463c1a89648fca44df6f"} Dec 02 20:57:23 crc kubenswrapper[4807]: I1202 20:57:23.769936 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbmxp" event={"ID":"dbc1866a-3ae4-4965-9794-32c61e384262","Type":"ContainerStarted","Data":"dc9754461a4dcd30b5a2cefc6383cb0a759aae448e34814b2c916fdcd4f610eb"} Dec 02 20:57:24 crc kubenswrapper[4807]: I1202 20:57:24.781644 4807 generic.go:334] "Generic (PLEG): container finished" podID="dbc1866a-3ae4-4965-9794-32c61e384262" 
containerID="d3b0213367e73f7a21e7f9d89f7d435579921feee5a0463c1a89648fca44df6f" exitCode=0 Dec 02 20:57:24 crc kubenswrapper[4807]: I1202 20:57:24.781739 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbmxp" event={"ID":"dbc1866a-3ae4-4965-9794-32c61e384262","Type":"ContainerDied","Data":"d3b0213367e73f7a21e7f9d89f7d435579921feee5a0463c1a89648fca44df6f"} Dec 02 20:57:24 crc kubenswrapper[4807]: I1202 20:57:24.790262 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94g68" event={"ID":"162650fa-0e30-49ff-8cf9-fdee15219f64","Type":"ContainerStarted","Data":"bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd"} Dec 02 20:57:24 crc kubenswrapper[4807]: I1202 20:57:24.830510 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-94g68" podStartSLOduration=3.299955508 podStartE2EDuration="5.830485426s" podCreationTimestamp="2025-12-02 20:57:19 +0000 UTC" firstStartedPulling="2025-12-02 20:57:21.738156821 +0000 UTC m=+3577.039064326" lastFinishedPulling="2025-12-02 20:57:24.268686749 +0000 UTC m=+3579.569594244" observedRunningTime="2025-12-02 20:57:24.81990181 +0000 UTC m=+3580.120809305" watchObservedRunningTime="2025-12-02 20:57:24.830485426 +0000 UTC m=+3580.131392931" Dec 02 20:57:25 crc kubenswrapper[4807]: I1202 20:57:25.803181 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbmxp" event={"ID":"dbc1866a-3ae4-4965-9794-32c61e384262","Type":"ContainerStarted","Data":"9dd1dbcc65d6f5c454522680f059348fd8f6be09a2668bd176289d2f816f63a1"} Dec 02 20:57:26 crc kubenswrapper[4807]: I1202 20:57:26.878664 4807 generic.go:334] "Generic (PLEG): container finished" podID="dbc1866a-3ae4-4965-9794-32c61e384262" containerID="9dd1dbcc65d6f5c454522680f059348fd8f6be09a2668bd176289d2f816f63a1" exitCode=0 Dec 02 20:57:26 crc kubenswrapper[4807]: I1202 
20:57:26.878708 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbmxp" event={"ID":"dbc1866a-3ae4-4965-9794-32c61e384262","Type":"ContainerDied","Data":"9dd1dbcc65d6f5c454522680f059348fd8f6be09a2668bd176289d2f816f63a1"} Dec 02 20:57:29 crc kubenswrapper[4807]: I1202 20:57:29.939147 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:29 crc kubenswrapper[4807]: I1202 20:57:29.939622 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:31 crc kubenswrapper[4807]: I1202 20:57:31.001966 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-94g68" podUID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerName="registry-server" probeResult="failure" output=< Dec 02 20:57:31 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 20:57:31 crc kubenswrapper[4807]: > Dec 02 20:57:31 crc kubenswrapper[4807]: I1202 20:57:31.962636 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr7xz" event={"ID":"7e62aa90-5be8-469a-b984-6adcb92e4a91","Type":"ContainerStarted","Data":"360867d23729c69afaff35f2ee3bebfd946f6dc7bf8091bbe640dde8196d99bb"} Dec 02 20:57:31 crc kubenswrapper[4807]: I1202 20:57:31.964643 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbmxp" event={"ID":"dbc1866a-3ae4-4965-9794-32c61e384262","Type":"ContainerStarted","Data":"2f6fc7bdaf4750bbfce63b97a2e7ee20dfcff120b3fc6babd9c55f8dfc51f5fb"} Dec 02 20:57:32 crc kubenswrapper[4807]: I1202 20:57:32.032246 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wbmxp" podStartSLOduration=3.108357247 podStartE2EDuration="10.032221013s" 
podCreationTimestamp="2025-12-02 20:57:22 +0000 UTC" firstStartedPulling="2025-12-02 20:57:24.784168489 +0000 UTC m=+3580.085075994" lastFinishedPulling="2025-12-02 20:57:31.708032265 +0000 UTC m=+3587.008939760" observedRunningTime="2025-12-02 20:57:32.020109234 +0000 UTC m=+3587.321016749" watchObservedRunningTime="2025-12-02 20:57:32.032221013 +0000 UTC m=+3587.333128508" Dec 02 20:57:32 crc kubenswrapper[4807]: E1202 20:57:32.353845 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e62aa90_5be8_469a_b984_6adcb92e4a91.slice/crio-360867d23729c69afaff35f2ee3bebfd946f6dc7bf8091bbe640dde8196d99bb.scope\": RecentStats: unable to find data in memory cache]" Dec 02 20:57:32 crc kubenswrapper[4807]: I1202 20:57:32.739031 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:32 crc kubenswrapper[4807]: I1202 20:57:32.739427 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:33 crc kubenswrapper[4807]: I1202 20:57:33.799377 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-wbmxp" podUID="dbc1866a-3ae4-4965-9794-32c61e384262" containerName="registry-server" probeResult="failure" output=< Dec 02 20:57:33 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 20:57:33 crc kubenswrapper[4807]: > Dec 02 20:57:34 crc kubenswrapper[4807]: I1202 20:57:34.002423 4807 generic.go:334] "Generic (PLEG): container finished" podID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerID="360867d23729c69afaff35f2ee3bebfd946f6dc7bf8091bbe640dde8196d99bb" exitCode=0 Dec 02 20:57:34 crc kubenswrapper[4807]: I1202 20:57:34.002522 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-cr7xz" event={"ID":"7e62aa90-5be8-469a-b984-6adcb92e4a91","Type":"ContainerDied","Data":"360867d23729c69afaff35f2ee3bebfd946f6dc7bf8091bbe640dde8196d99bb"} Dec 02 20:57:35 crc kubenswrapper[4807]: I1202 20:57:35.015351 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr7xz" event={"ID":"7e62aa90-5be8-469a-b984-6adcb92e4a91","Type":"ContainerStarted","Data":"c4f1c24df6b17fa0c07f299405aeeba203386417010e00fdfe19de9a591548cd"} Dec 02 20:57:35 crc kubenswrapper[4807]: I1202 20:57:35.045993 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cr7xz" podStartSLOduration=3.212410105 podStartE2EDuration="17.045960069s" podCreationTimestamp="2025-12-02 20:57:18 +0000 UTC" firstStartedPulling="2025-12-02 20:57:20.72450474 +0000 UTC m=+3576.025412235" lastFinishedPulling="2025-12-02 20:57:34.558054694 +0000 UTC m=+3589.858962199" observedRunningTime="2025-12-02 20:57:35.033948942 +0000 UTC m=+3590.334856457" watchObservedRunningTime="2025-12-02 20:57:35.045960069 +0000 UTC m=+3590.346867604" Dec 02 20:57:39 crc kubenswrapper[4807]: I1202 20:57:39.153521 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:39 crc kubenswrapper[4807]: I1202 20:57:39.153909 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:39 crc kubenswrapper[4807]: I1202 20:57:39.989176 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:40 crc kubenswrapper[4807]: I1202 20:57:40.039552 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:40 crc kubenswrapper[4807]: I1202 20:57:40.204468 4807 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cr7xz" podUID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerName="registry-server" probeResult="failure" output=< Dec 02 20:57:40 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 20:57:40 crc kubenswrapper[4807]: > Dec 02 20:57:40 crc kubenswrapper[4807]: I1202 20:57:40.233670 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94g68"] Dec 02 20:57:41 crc kubenswrapper[4807]: I1202 20:57:41.082712 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-94g68" podUID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerName="registry-server" containerID="cri-o://bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd" gracePeriod=2 Dec 02 20:57:41 crc kubenswrapper[4807]: I1202 20:57:41.790653 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:41 crc kubenswrapper[4807]: I1202 20:57:41.893747 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-utilities\") pod \"162650fa-0e30-49ff-8cf9-fdee15219f64\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " Dec 02 20:57:41 crc kubenswrapper[4807]: I1202 20:57:41.893878 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjck8\" (UniqueName: \"kubernetes.io/projected/162650fa-0e30-49ff-8cf9-fdee15219f64-kube-api-access-tjck8\") pod \"162650fa-0e30-49ff-8cf9-fdee15219f64\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " Dec 02 20:57:41 crc kubenswrapper[4807]: I1202 20:57:41.893977 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-catalog-content\") pod \"162650fa-0e30-49ff-8cf9-fdee15219f64\" (UID: \"162650fa-0e30-49ff-8cf9-fdee15219f64\") " Dec 02 20:57:41 crc kubenswrapper[4807]: I1202 20:57:41.894861 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-utilities" (OuterVolumeSpecName: "utilities") pod "162650fa-0e30-49ff-8cf9-fdee15219f64" (UID: "162650fa-0e30-49ff-8cf9-fdee15219f64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:57:41 crc kubenswrapper[4807]: I1202 20:57:41.900325 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/162650fa-0e30-49ff-8cf9-fdee15219f64-kube-api-access-tjck8" (OuterVolumeSpecName: "kube-api-access-tjck8") pod "162650fa-0e30-49ff-8cf9-fdee15219f64" (UID: "162650fa-0e30-49ff-8cf9-fdee15219f64"). InnerVolumeSpecName "kube-api-access-tjck8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:57:41 crc kubenswrapper[4807]: I1202 20:57:41.940574 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "162650fa-0e30-49ff-8cf9-fdee15219f64" (UID: "162650fa-0e30-49ff-8cf9-fdee15219f64"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:57:41 crc kubenswrapper[4807]: I1202 20:57:41.997367 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:57:41 crc kubenswrapper[4807]: I1202 20:57:41.997443 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/162650fa-0e30-49ff-8cf9-fdee15219f64-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:57:41 crc kubenswrapper[4807]: I1202 20:57:41.997470 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjck8\" (UniqueName: \"kubernetes.io/projected/162650fa-0e30-49ff-8cf9-fdee15219f64-kube-api-access-tjck8\") on node \"crc\" DevicePath \"\"" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.097781 4807 generic.go:334] "Generic (PLEG): container finished" podID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerID="bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd" exitCode=0 Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.097843 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94g68" event={"ID":"162650fa-0e30-49ff-8cf9-fdee15219f64","Type":"ContainerDied","Data":"bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd"} Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.097937 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94g68" event={"ID":"162650fa-0e30-49ff-8cf9-fdee15219f64","Type":"ContainerDied","Data":"3661f725e6f4ee01d3d98efdbc76bd40c7e5c4b5bc7025515c26b0d3664fa733"} Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.097928 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94g68" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.097977 4807 scope.go:117] "RemoveContainer" containerID="bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.149033 4807 scope.go:117] "RemoveContainer" containerID="6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.160946 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94g68"] Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.173614 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-94g68"] Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.175567 4807 scope.go:117] "RemoveContainer" containerID="7a0a3cea37b236ea5f8e49a99bcb7faed146129b751e1bb6cf3290ca9ef3d747" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.228595 4807 scope.go:117] "RemoveContainer" containerID="bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd" Dec 02 20:57:42 crc kubenswrapper[4807]: E1202 20:57:42.229369 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd\": container with ID starting with bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd not found: ID does not exist" containerID="bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.229414 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd"} err="failed to get container status \"bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd\": rpc error: code = NotFound desc = could not find 
container \"bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd\": container with ID starting with bc0982db5164ccd2433536b7c5ebbe1c98f33a412c1d2740302bbd0e160bafcd not found: ID does not exist" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.229446 4807 scope.go:117] "RemoveContainer" containerID="6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9" Dec 02 20:57:42 crc kubenswrapper[4807]: E1202 20:57:42.229909 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9\": container with ID starting with 6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9 not found: ID does not exist" containerID="6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.229931 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9"} err="failed to get container status \"6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9\": rpc error: code = NotFound desc = could not find container \"6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9\": container with ID starting with 6d53c057ae642d32be84031ced64e10e9b660a141dd222d074a05a2ae40b25c9 not found: ID does not exist" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.229944 4807 scope.go:117] "RemoveContainer" containerID="7a0a3cea37b236ea5f8e49a99bcb7faed146129b751e1bb6cf3290ca9ef3d747" Dec 02 20:57:42 crc kubenswrapper[4807]: E1202 20:57:42.230477 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a0a3cea37b236ea5f8e49a99bcb7faed146129b751e1bb6cf3290ca9ef3d747\": container with ID starting with 7a0a3cea37b236ea5f8e49a99bcb7faed146129b751e1bb6cf3290ca9ef3d747 not found: ID does 
not exist" containerID="7a0a3cea37b236ea5f8e49a99bcb7faed146129b751e1bb6cf3290ca9ef3d747" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.230657 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a0a3cea37b236ea5f8e49a99bcb7faed146129b751e1bb6cf3290ca9ef3d747"} err="failed to get container status \"7a0a3cea37b236ea5f8e49a99bcb7faed146129b751e1bb6cf3290ca9ef3d747\": rpc error: code = NotFound desc = could not find container \"7a0a3cea37b236ea5f8e49a99bcb7faed146129b751e1bb6cf3290ca9ef3d747\": container with ID starting with 7a0a3cea37b236ea5f8e49a99bcb7faed146129b751e1bb6cf3290ca9ef3d747 not found: ID does not exist" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.802902 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.874678 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:42 crc kubenswrapper[4807]: I1202 20:57:42.985199 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="162650fa-0e30-49ff-8cf9-fdee15219f64" path="/var/lib/kubelet/pods/162650fa-0e30-49ff-8cf9-fdee15219f64/volumes" Dec 02 20:57:44 crc kubenswrapper[4807]: I1202 20:57:44.640770 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbmxp"] Dec 02 20:57:44 crc kubenswrapper[4807]: I1202 20:57:44.642497 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wbmxp" podUID="dbc1866a-3ae4-4965-9794-32c61e384262" containerName="registry-server" containerID="cri-o://2f6fc7bdaf4750bbfce63b97a2e7ee20dfcff120b3fc6babd9c55f8dfc51f5fb" gracePeriod=2 Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.142417 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="dbc1866a-3ae4-4965-9794-32c61e384262" containerID="2f6fc7bdaf4750bbfce63b97a2e7ee20dfcff120b3fc6babd9c55f8dfc51f5fb" exitCode=0 Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.142527 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbmxp" event={"ID":"dbc1866a-3ae4-4965-9794-32c61e384262","Type":"ContainerDied","Data":"2f6fc7bdaf4750bbfce63b97a2e7ee20dfcff120b3fc6babd9c55f8dfc51f5fb"} Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.237699 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.272679 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8h6p\" (UniqueName: \"kubernetes.io/projected/dbc1866a-3ae4-4965-9794-32c61e384262-kube-api-access-m8h6p\") pod \"dbc1866a-3ae4-4965-9794-32c61e384262\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.272972 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-catalog-content\") pod \"dbc1866a-3ae4-4965-9794-32c61e384262\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.273073 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-utilities\") pod \"dbc1866a-3ae4-4965-9794-32c61e384262\" (UID: \"dbc1866a-3ae4-4965-9794-32c61e384262\") " Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.274398 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-utilities" (OuterVolumeSpecName: "utilities") pod 
"dbc1866a-3ae4-4965-9794-32c61e384262" (UID: "dbc1866a-3ae4-4965-9794-32c61e384262"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.280127 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc1866a-3ae4-4965-9794-32c61e384262-kube-api-access-m8h6p" (OuterVolumeSpecName: "kube-api-access-m8h6p") pod "dbc1866a-3ae4-4965-9794-32c61e384262" (UID: "dbc1866a-3ae4-4965-9794-32c61e384262"). InnerVolumeSpecName "kube-api-access-m8h6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.292453 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbc1866a-3ae4-4965-9794-32c61e384262" (UID: "dbc1866a-3ae4-4965-9794-32c61e384262"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.375402 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8h6p\" (UniqueName: \"kubernetes.io/projected/dbc1866a-3ae4-4965-9794-32c61e384262-kube-api-access-m8h6p\") on node \"crc\" DevicePath \"\"" Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.375442 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:57:45 crc kubenswrapper[4807]: I1202 20:57:45.375453 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc1866a-3ae4-4965-9794-32c61e384262-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:57:46 crc kubenswrapper[4807]: I1202 20:57:46.154979 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wbmxp" event={"ID":"dbc1866a-3ae4-4965-9794-32c61e384262","Type":"ContainerDied","Data":"dc9754461a4dcd30b5a2cefc6383cb0a759aae448e34814b2c916fdcd4f610eb"} Dec 02 20:57:46 crc kubenswrapper[4807]: I1202 20:57:46.155181 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wbmxp" Dec 02 20:57:46 crc kubenswrapper[4807]: I1202 20:57:46.155262 4807 scope.go:117] "RemoveContainer" containerID="2f6fc7bdaf4750bbfce63b97a2e7ee20dfcff120b3fc6babd9c55f8dfc51f5fb" Dec 02 20:57:46 crc kubenswrapper[4807]: I1202 20:57:46.181946 4807 scope.go:117] "RemoveContainer" containerID="9dd1dbcc65d6f5c454522680f059348fd8f6be09a2668bd176289d2f816f63a1" Dec 02 20:57:46 crc kubenswrapper[4807]: I1202 20:57:46.208793 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbmxp"] Dec 02 20:57:46 crc kubenswrapper[4807]: I1202 20:57:46.220181 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wbmxp"] Dec 02 20:57:46 crc kubenswrapper[4807]: I1202 20:57:46.227736 4807 scope.go:117] "RemoveContainer" containerID="d3b0213367e73f7a21e7f9d89f7d435579921feee5a0463c1a89648fca44df6f" Dec 02 20:57:46 crc kubenswrapper[4807]: I1202 20:57:46.987621 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbc1866a-3ae4-4965-9794-32c61e384262" path="/var/lib/kubelet/pods/dbc1866a-3ae4-4965-9794-32c61e384262/volumes" Dec 02 20:57:49 crc kubenswrapper[4807]: I1202 20:57:49.228611 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:49 crc kubenswrapper[4807]: I1202 20:57:49.277921 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cr7xz" Dec 02 20:57:52 crc kubenswrapper[4807]: I1202 20:57:52.673914 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cr7xz"] Dec 02 20:57:53 crc kubenswrapper[4807]: I1202 20:57:53.443501 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kpfcc"] Dec 02 20:57:53 crc kubenswrapper[4807]: I1202 20:57:53.444245 4807 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kpfcc" podUID="7a430d81-93d6-44ac-b492-762898abc32c" containerName="registry-server" containerID="cri-o://8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2" gracePeriod=2 Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.022349 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.130650 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlhg4\" (UniqueName: \"kubernetes.io/projected/7a430d81-93d6-44ac-b492-762898abc32c-kube-api-access-vlhg4\") pod \"7a430d81-93d6-44ac-b492-762898abc32c\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.130771 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-utilities\") pod \"7a430d81-93d6-44ac-b492-762898abc32c\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.130994 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-catalog-content\") pod \"7a430d81-93d6-44ac-b492-762898abc32c\" (UID: \"7a430d81-93d6-44ac-b492-762898abc32c\") " Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.131220 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-utilities" (OuterVolumeSpecName: "utilities") pod "7a430d81-93d6-44ac-b492-762898abc32c" (UID: "7a430d81-93d6-44ac-b492-762898abc32c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.131870 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.137994 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a430d81-93d6-44ac-b492-762898abc32c-kube-api-access-vlhg4" (OuterVolumeSpecName: "kube-api-access-vlhg4") pod "7a430d81-93d6-44ac-b492-762898abc32c" (UID: "7a430d81-93d6-44ac-b492-762898abc32c"). InnerVolumeSpecName "kube-api-access-vlhg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.231368 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a430d81-93d6-44ac-b492-762898abc32c" (UID: "7a430d81-93d6-44ac-b492-762898abc32c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.234583 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlhg4\" (UniqueName: \"kubernetes.io/projected/7a430d81-93d6-44ac-b492-762898abc32c-kube-api-access-vlhg4\") on node \"crc\" DevicePath \"\"" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.234639 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a430d81-93d6-44ac-b492-762898abc32c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.249784 4807 generic.go:334] "Generic (PLEG): container finished" podID="7a430d81-93d6-44ac-b492-762898abc32c" containerID="8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2" exitCode=0 Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.249830 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpfcc" event={"ID":"7a430d81-93d6-44ac-b492-762898abc32c","Type":"ContainerDied","Data":"8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2"} Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.249865 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kpfcc" event={"ID":"7a430d81-93d6-44ac-b492-762898abc32c","Type":"ContainerDied","Data":"4f30abe756848ac46d4b5308ad7e7bd38f4cd1f9ff891efd0fd2710aa6e1349a"} Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.249885 4807 scope.go:117] "RemoveContainer" containerID="8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.249937 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kpfcc" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.271877 4807 scope.go:117] "RemoveContainer" containerID="5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.292134 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kpfcc"] Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.299454 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kpfcc"] Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.307872 4807 scope.go:117] "RemoveContainer" containerID="83fa0c73d92c61be80bf6eace0cbf8fb07f4c5af838051eed16f909234846b50" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.361385 4807 scope.go:117] "RemoveContainer" containerID="8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2" Dec 02 20:57:54 crc kubenswrapper[4807]: E1202 20:57:54.363276 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2\": container with ID starting with 8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2 not found: ID does not exist" containerID="8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.363311 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2"} err="failed to get container status \"8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2\": rpc error: code = NotFound desc = could not find container \"8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2\": container with ID starting with 8c398d41d1411cc386d3314cc90c7dc4404b318131f5f4ada44ed947a22218c2 not found: ID does 
not exist" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.363335 4807 scope.go:117] "RemoveContainer" containerID="5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0" Dec 02 20:57:54 crc kubenswrapper[4807]: E1202 20:57:54.364354 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0\": container with ID starting with 5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0 not found: ID does not exist" containerID="5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.364378 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0"} err="failed to get container status \"5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0\": rpc error: code = NotFound desc = could not find container \"5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0\": container with ID starting with 5f1b7f45acf334643a5fc2e178362676a655aebdb40edda944292e49893b39e0 not found: ID does not exist" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.364390 4807 scope.go:117] "RemoveContainer" containerID="83fa0c73d92c61be80bf6eace0cbf8fb07f4c5af838051eed16f909234846b50" Dec 02 20:57:54 crc kubenswrapper[4807]: E1202 20:57:54.365005 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83fa0c73d92c61be80bf6eace0cbf8fb07f4c5af838051eed16f909234846b50\": container with ID starting with 83fa0c73d92c61be80bf6eace0cbf8fb07f4c5af838051eed16f909234846b50 not found: ID does not exist" containerID="83fa0c73d92c61be80bf6eace0cbf8fb07f4c5af838051eed16f909234846b50" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.365045 4807 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83fa0c73d92c61be80bf6eace0cbf8fb07f4c5af838051eed16f909234846b50"} err="failed to get container status \"83fa0c73d92c61be80bf6eace0cbf8fb07f4c5af838051eed16f909234846b50\": rpc error: code = NotFound desc = could not find container \"83fa0c73d92c61be80bf6eace0cbf8fb07f4c5af838051eed16f909234846b50\": container with ID starting with 83fa0c73d92c61be80bf6eace0cbf8fb07f4c5af838051eed16f909234846b50 not found: ID does not exist" Dec 02 20:57:54 crc kubenswrapper[4807]: I1202 20:57:54.982836 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a430d81-93d6-44ac-b492-762898abc32c" path="/var/lib/kubelet/pods/7a430d81-93d6-44ac-b492-762898abc32c/volumes" Dec 02 20:57:58 crc kubenswrapper[4807]: I1202 20:57:58.292906 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:57:58 crc kubenswrapper[4807]: I1202 20:57:58.293321 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:58:28 crc kubenswrapper[4807]: I1202 20:58:28.293720 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:58:28 crc kubenswrapper[4807]: I1202 20:58:28.294400 4807 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:58:58 crc kubenswrapper[4807]: I1202 20:58:58.292975 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 20:58:58 crc kubenswrapper[4807]: I1202 20:58:58.293495 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 20:58:58 crc kubenswrapper[4807]: I1202 20:58:58.293554 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 20:58:58 crc kubenswrapper[4807]: I1202 20:58:58.294519 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 20:58:58 crc kubenswrapper[4807]: I1202 20:58:58.294597 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" 
containerID="cri-o://85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" gracePeriod=600 Dec 02 20:58:58 crc kubenswrapper[4807]: E1202 20:58:58.432641 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:58:59 crc kubenswrapper[4807]: I1202 20:58:59.031922 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" exitCode=0 Dec 02 20:58:59 crc kubenswrapper[4807]: I1202 20:58:59.032004 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0"} Dec 02 20:58:59 crc kubenswrapper[4807]: I1202 20:58:59.032214 4807 scope.go:117] "RemoveContainer" containerID="1dfd2daec5bfc89c1478b17450a777538e2101d84f5e4b5fbfbaa5205355f2b0" Dec 02 20:58:59 crc kubenswrapper[4807]: I1202 20:58:59.032811 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 20:58:59 crc kubenswrapper[4807]: E1202 20:58:59.033070 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" 
podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:59:13 crc kubenswrapper[4807]: I1202 20:59:13.972355 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 20:59:13 crc kubenswrapper[4807]: E1202 20:59:13.973155 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:59:27 crc kubenswrapper[4807]: I1202 20:59:27.972445 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 20:59:27 crc kubenswrapper[4807]: E1202 20:59:27.973325 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:59:39 crc kubenswrapper[4807]: I1202 20:59:39.972511 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 20:59:39 crc kubenswrapper[4807]: E1202 20:59:39.973688 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 20:59:50 crc kubenswrapper[4807]: I1202 20:59:50.973366 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 20:59:50 crc kubenswrapper[4807]: E1202 20:59:50.974173 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.198105 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh"] Dec 02 21:00:00 crc kubenswrapper[4807]: E1202 21:00:00.199338 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a430d81-93d6-44ac-b492-762898abc32c" containerName="registry-server" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.199362 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a430d81-93d6-44ac-b492-762898abc32c" containerName="registry-server" Dec 02 21:00:00 crc kubenswrapper[4807]: E1202 21:00:00.199381 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a430d81-93d6-44ac-b492-762898abc32c" containerName="extract-content" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.199392 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a430d81-93d6-44ac-b492-762898abc32c" containerName="extract-content" Dec 02 21:00:00 crc kubenswrapper[4807]: E1202 21:00:00.199413 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerName="extract-content" Dec 02 21:00:00 crc 
kubenswrapper[4807]: I1202 21:00:00.199424 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerName="extract-content" Dec 02 21:00:00 crc kubenswrapper[4807]: E1202 21:00:00.199443 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc1866a-3ae4-4965-9794-32c61e384262" containerName="registry-server" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.199454 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc1866a-3ae4-4965-9794-32c61e384262" containerName="registry-server" Dec 02 21:00:00 crc kubenswrapper[4807]: E1202 21:00:00.199480 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc1866a-3ae4-4965-9794-32c61e384262" containerName="extract-utilities" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.199491 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc1866a-3ae4-4965-9794-32c61e384262" containerName="extract-utilities" Dec 02 21:00:00 crc kubenswrapper[4807]: E1202 21:00:00.199505 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerName="extract-utilities" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.199516 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerName="extract-utilities" Dec 02 21:00:00 crc kubenswrapper[4807]: E1202 21:00:00.199546 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc1866a-3ae4-4965-9794-32c61e384262" containerName="extract-content" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.199556 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc1866a-3ae4-4965-9794-32c61e384262" containerName="extract-content" Dec 02 21:00:00 crc kubenswrapper[4807]: E1202 21:00:00.199578 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerName="registry-server" Dec 02 21:00:00 crc 
kubenswrapper[4807]: I1202 21:00:00.199589 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerName="registry-server" Dec 02 21:00:00 crc kubenswrapper[4807]: E1202 21:00:00.199616 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a430d81-93d6-44ac-b492-762898abc32c" containerName="extract-utilities" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.199626 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a430d81-93d6-44ac-b492-762898abc32c" containerName="extract-utilities" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.199981 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="162650fa-0e30-49ff-8cf9-fdee15219f64" containerName="registry-server" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.199997 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc1866a-3ae4-4965-9794-32c61e384262" containerName="registry-server" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.200030 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a430d81-93d6-44ac-b492-762898abc32c" containerName="registry-server" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.200969 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.203659 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.203669 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.211840 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh"] Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.345333 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d0d1660-0666-4134-a153-421751d4eff4-secret-volume\") pod \"collect-profiles-29411820-f5psh\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.345490 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlj2\" (UniqueName: \"kubernetes.io/projected/6d0d1660-0666-4134-a153-421751d4eff4-kube-api-access-qrlj2\") pod \"collect-profiles-29411820-f5psh\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.345520 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d0d1660-0666-4134-a153-421751d4eff4-config-volume\") pod \"collect-profiles-29411820-f5psh\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.447569 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d0d1660-0666-4134-a153-421751d4eff4-secret-volume\") pod \"collect-profiles-29411820-f5psh\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.447658 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlj2\" (UniqueName: \"kubernetes.io/projected/6d0d1660-0666-4134-a153-421751d4eff4-kube-api-access-qrlj2\") pod \"collect-profiles-29411820-f5psh\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.447689 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d0d1660-0666-4134-a153-421751d4eff4-config-volume\") pod \"collect-profiles-29411820-f5psh\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.448538 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d0d1660-0666-4134-a153-421751d4eff4-config-volume\") pod \"collect-profiles-29411820-f5psh\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.454752 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6d0d1660-0666-4134-a153-421751d4eff4-secret-volume\") pod \"collect-profiles-29411820-f5psh\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.477907 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlj2\" (UniqueName: \"kubernetes.io/projected/6d0d1660-0666-4134-a153-421751d4eff4-kube-api-access-qrlj2\") pod \"collect-profiles-29411820-f5psh\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:00 crc kubenswrapper[4807]: I1202 21:00:00.528609 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:01 crc kubenswrapper[4807]: I1202 21:00:01.008292 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh"] Dec 02 21:00:01 crc kubenswrapper[4807]: I1202 21:00:01.775921 4807 generic.go:334] "Generic (PLEG): container finished" podID="6d0d1660-0666-4134-a153-421751d4eff4" containerID="fce359c006c6e47702482848da026e5446512965b6e4560380ededb49cc4917a" exitCode=0 Dec 02 21:00:01 crc kubenswrapper[4807]: I1202 21:00:01.776030 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" event={"ID":"6d0d1660-0666-4134-a153-421751d4eff4","Type":"ContainerDied","Data":"fce359c006c6e47702482848da026e5446512965b6e4560380ededb49cc4917a"} Dec 02 21:00:01 crc kubenswrapper[4807]: I1202 21:00:01.776360 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" 
event={"ID":"6d0d1660-0666-4134-a153-421751d4eff4","Type":"ContainerStarted","Data":"21902fd7a38127553131afb346579de650d3bd4aad630113abc105cc0fa2f83f"} Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.379150 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.531681 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d0d1660-0666-4134-a153-421751d4eff4-secret-volume\") pod \"6d0d1660-0666-4134-a153-421751d4eff4\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.531923 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d0d1660-0666-4134-a153-421751d4eff4-config-volume\") pod \"6d0d1660-0666-4134-a153-421751d4eff4\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.531977 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrlj2\" (UniqueName: \"kubernetes.io/projected/6d0d1660-0666-4134-a153-421751d4eff4-kube-api-access-qrlj2\") pod \"6d0d1660-0666-4134-a153-421751d4eff4\" (UID: \"6d0d1660-0666-4134-a153-421751d4eff4\") " Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.532570 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d0d1660-0666-4134-a153-421751d4eff4-config-volume" (OuterVolumeSpecName: "config-volume") pod "6d0d1660-0666-4134-a153-421751d4eff4" (UID: "6d0d1660-0666-4134-a153-421751d4eff4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.539490 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d0d1660-0666-4134-a153-421751d4eff4-kube-api-access-qrlj2" (OuterVolumeSpecName: "kube-api-access-qrlj2") pod "6d0d1660-0666-4134-a153-421751d4eff4" (UID: "6d0d1660-0666-4134-a153-421751d4eff4"). InnerVolumeSpecName "kube-api-access-qrlj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.548513 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0d1660-0666-4134-a153-421751d4eff4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6d0d1660-0666-4134-a153-421751d4eff4" (UID: "6d0d1660-0666-4134-a153-421751d4eff4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.635111 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d0d1660-0666-4134-a153-421751d4eff4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.635165 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrlj2\" (UniqueName: \"kubernetes.io/projected/6d0d1660-0666-4134-a153-421751d4eff4-kube-api-access-qrlj2\") on node \"crc\" DevicePath \"\"" Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.635185 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d0d1660-0666-4134-a153-421751d4eff4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.809802 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" 
event={"ID":"6d0d1660-0666-4134-a153-421751d4eff4","Type":"ContainerDied","Data":"21902fd7a38127553131afb346579de650d3bd4aad630113abc105cc0fa2f83f"} Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.809899 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21902fd7a38127553131afb346579de650d3bd4aad630113abc105cc0fa2f83f" Dec 02 21:00:03 crc kubenswrapper[4807]: I1202 21:00:03.809951 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh" Dec 02 21:00:04 crc kubenswrapper[4807]: I1202 21:00:04.471631 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl"] Dec 02 21:00:04 crc kubenswrapper[4807]: I1202 21:00:04.485853 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411775-nngpl"] Dec 02 21:00:04 crc kubenswrapper[4807]: I1202 21:00:04.985110 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82089ba4-d2c7-49e1-96e0-bf2d1a082aa0" path="/var/lib/kubelet/pods/82089ba4-d2c7-49e1-96e0-bf2d1a082aa0/volumes" Dec 02 21:00:05 crc kubenswrapper[4807]: I1202 21:00:05.973344 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:00:05 crc kubenswrapper[4807]: E1202 21:00:05.974144 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:00:12 crc kubenswrapper[4807]: I1202 21:00:12.498267 4807 scope.go:117] "RemoveContainer" 
containerID="97304526de2b565171d77469b97822d3affa6c08115f8ec5697384099d48b94b" Dec 02 21:00:19 crc kubenswrapper[4807]: I1202 21:00:19.973064 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:00:19 crc kubenswrapper[4807]: E1202 21:00:19.974453 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:00:30 crc kubenswrapper[4807]: I1202 21:00:30.973628 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:00:30 crc kubenswrapper[4807]: E1202 21:00:30.974978 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.269301 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p4kqg"] Dec 02 21:00:38 crc kubenswrapper[4807]: E1202 21:00:38.270253 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d0d1660-0666-4134-a153-421751d4eff4" containerName="collect-profiles" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.270266 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0d1660-0666-4134-a153-421751d4eff4" containerName="collect-profiles" Dec 02 
21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.270483 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d0d1660-0666-4134-a153-421751d4eff4" containerName="collect-profiles" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.271958 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.295967 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p4kqg"] Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.322013 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbtsr\" (UniqueName: \"kubernetes.io/projected/8269fab7-aff6-4efc-9767-04da48ff1138-kube-api-access-xbtsr\") pod \"community-operators-p4kqg\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.322252 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-catalog-content\") pod \"community-operators-p4kqg\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.322356 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-utilities\") pod \"community-operators-p4kqg\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.424852 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbtsr\" (UniqueName: 
\"kubernetes.io/projected/8269fab7-aff6-4efc-9767-04da48ff1138-kube-api-access-xbtsr\") pod \"community-operators-p4kqg\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.425319 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-catalog-content\") pod \"community-operators-p4kqg\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.425359 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-utilities\") pod \"community-operators-p4kqg\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.425944 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-utilities\") pod \"community-operators-p4kqg\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.426473 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-catalog-content\") pod \"community-operators-p4kqg\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.458331 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbtsr\" (UniqueName: 
\"kubernetes.io/projected/8269fab7-aff6-4efc-9767-04da48ff1138-kube-api-access-xbtsr\") pod \"community-operators-p4kqg\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:38 crc kubenswrapper[4807]: I1202 21:00:38.593464 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:39 crc kubenswrapper[4807]: I1202 21:00:39.111233 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p4kqg"] Dec 02 21:00:39 crc kubenswrapper[4807]: I1202 21:00:39.234814 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4kqg" event={"ID":"8269fab7-aff6-4efc-9767-04da48ff1138","Type":"ContainerStarted","Data":"981bbcdbe0de0097e84888862f3f275f4050787ce3a65dcfd28541a4747b5bd8"} Dec 02 21:00:40 crc kubenswrapper[4807]: I1202 21:00:40.246880 4807 generic.go:334] "Generic (PLEG): container finished" podID="8269fab7-aff6-4efc-9767-04da48ff1138" containerID="f892565ec3ade93084e0e9d472c0dd0b5a9eab08e3b42ec8a3a1131635772da5" exitCode=0 Dec 02 21:00:40 crc kubenswrapper[4807]: I1202 21:00:40.246949 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4kqg" event={"ID":"8269fab7-aff6-4efc-9767-04da48ff1138","Type":"ContainerDied","Data":"f892565ec3ade93084e0e9d472c0dd0b5a9eab08e3b42ec8a3a1131635772da5"} Dec 02 21:00:42 crc kubenswrapper[4807]: I1202 21:00:42.270345 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4kqg" event={"ID":"8269fab7-aff6-4efc-9767-04da48ff1138","Type":"ContainerStarted","Data":"2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a"} Dec 02 21:00:43 crc kubenswrapper[4807]: I1202 21:00:43.280290 4807 generic.go:334] "Generic (PLEG): container finished" podID="8269fab7-aff6-4efc-9767-04da48ff1138" 
containerID="2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a" exitCode=0 Dec 02 21:00:43 crc kubenswrapper[4807]: I1202 21:00:43.280419 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4kqg" event={"ID":"8269fab7-aff6-4efc-9767-04da48ff1138","Type":"ContainerDied","Data":"2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a"} Dec 02 21:00:43 crc kubenswrapper[4807]: I1202 21:00:43.984470 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:00:43 crc kubenswrapper[4807]: E1202 21:00:43.985627 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:00:44 crc kubenswrapper[4807]: I1202 21:00:44.303571 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4kqg" event={"ID":"8269fab7-aff6-4efc-9767-04da48ff1138","Type":"ContainerStarted","Data":"c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b"} Dec 02 21:00:44 crc kubenswrapper[4807]: I1202 21:00:44.328610 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p4kqg" podStartSLOduration=2.666760408 podStartE2EDuration="6.328590431s" podCreationTimestamp="2025-12-02 21:00:38 +0000 UTC" firstStartedPulling="2025-12-02 21:00:40.249544634 +0000 UTC m=+3775.550452139" lastFinishedPulling="2025-12-02 21:00:43.911374627 +0000 UTC m=+3779.212282162" observedRunningTime="2025-12-02 21:00:44.32543622 +0000 UTC m=+3779.626343725" watchObservedRunningTime="2025-12-02 
21:00:44.328590431 +0000 UTC m=+3779.629497946" Dec 02 21:00:48 crc kubenswrapper[4807]: I1202 21:00:48.593648 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:48 crc kubenswrapper[4807]: I1202 21:00:48.594401 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:48 crc kubenswrapper[4807]: I1202 21:00:48.644184 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:49 crc kubenswrapper[4807]: I1202 21:00:49.451015 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:49 crc kubenswrapper[4807]: I1202 21:00:49.503323 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p4kqg"] Dec 02 21:00:51 crc kubenswrapper[4807]: I1202 21:00:51.385544 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p4kqg" podUID="8269fab7-aff6-4efc-9767-04da48ff1138" containerName="registry-server" containerID="cri-o://c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b" gracePeriod=2 Dec 02 21:00:51 crc kubenswrapper[4807]: I1202 21:00:51.889702 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.045476 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-utilities\") pod \"8269fab7-aff6-4efc-9767-04da48ff1138\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.045733 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-catalog-content\") pod \"8269fab7-aff6-4efc-9767-04da48ff1138\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.045776 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbtsr\" (UniqueName: \"kubernetes.io/projected/8269fab7-aff6-4efc-9767-04da48ff1138-kube-api-access-xbtsr\") pod \"8269fab7-aff6-4efc-9767-04da48ff1138\" (UID: \"8269fab7-aff6-4efc-9767-04da48ff1138\") " Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.047710 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-utilities" (OuterVolumeSpecName: "utilities") pod "8269fab7-aff6-4efc-9767-04da48ff1138" (UID: "8269fab7-aff6-4efc-9767-04da48ff1138"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.053247 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8269fab7-aff6-4efc-9767-04da48ff1138-kube-api-access-xbtsr" (OuterVolumeSpecName: "kube-api-access-xbtsr") pod "8269fab7-aff6-4efc-9767-04da48ff1138" (UID: "8269fab7-aff6-4efc-9767-04da48ff1138"). InnerVolumeSpecName "kube-api-access-xbtsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.151365 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbtsr\" (UniqueName: \"kubernetes.io/projected/8269fab7-aff6-4efc-9767-04da48ff1138-kube-api-access-xbtsr\") on node \"crc\" DevicePath \"\"" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.151429 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.398810 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8269fab7-aff6-4efc-9767-04da48ff1138" (UID: "8269fab7-aff6-4efc-9767-04da48ff1138"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.404021 4807 generic.go:334] "Generic (PLEG): container finished" podID="8269fab7-aff6-4efc-9767-04da48ff1138" containerID="c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b" exitCode=0 Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.404149 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4kqg" event={"ID":"8269fab7-aff6-4efc-9767-04da48ff1138","Type":"ContainerDied","Data":"c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b"} Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.404233 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4kqg" event={"ID":"8269fab7-aff6-4efc-9767-04da48ff1138","Type":"ContainerDied","Data":"981bbcdbe0de0097e84888862f3f275f4050787ce3a65dcfd28541a4747b5bd8"} Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 
21:00:52.404268 4807 scope.go:117] "RemoveContainer" containerID="c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.404105 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p4kqg" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.426297 4807 scope.go:117] "RemoveContainer" containerID="2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.447983 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p4kqg"] Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.457499 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p4kqg"] Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.459069 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8269fab7-aff6-4efc-9767-04da48ff1138-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.459377 4807 scope.go:117] "RemoveContainer" containerID="f892565ec3ade93084e0e9d472c0dd0b5a9eab08e3b42ec8a3a1131635772da5" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.510853 4807 scope.go:117] "RemoveContainer" containerID="c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b" Dec 02 21:00:52 crc kubenswrapper[4807]: E1202 21:00:52.511475 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b\": container with ID starting with c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b not found: ID does not exist" containerID="c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 
21:00:52.511517 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b"} err="failed to get container status \"c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b\": rpc error: code = NotFound desc = could not find container \"c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b\": container with ID starting with c1db421eb45628d65628d78ecf4f1a3133c3d41cd48e9c2e8c9c1a8371a49b0b not found: ID does not exist" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.511542 4807 scope.go:117] "RemoveContainer" containerID="2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a" Dec 02 21:00:52 crc kubenswrapper[4807]: E1202 21:00:52.512051 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a\": container with ID starting with 2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a not found: ID does not exist" containerID="2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.512491 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a"} err="failed to get container status \"2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a\": rpc error: code = NotFound desc = could not find container \"2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a\": container with ID starting with 2bd86b5bb91e3bd492c995de0c673a0fef39ad7e99e2ecdb8489d260b3cd102a not found: ID does not exist" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.512603 4807 scope.go:117] "RemoveContainer" containerID="f892565ec3ade93084e0e9d472c0dd0b5a9eab08e3b42ec8a3a1131635772da5" Dec 02 21:00:52 crc 
kubenswrapper[4807]: E1202 21:00:52.513063 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f892565ec3ade93084e0e9d472c0dd0b5a9eab08e3b42ec8a3a1131635772da5\": container with ID starting with f892565ec3ade93084e0e9d472c0dd0b5a9eab08e3b42ec8a3a1131635772da5 not found: ID does not exist" containerID="f892565ec3ade93084e0e9d472c0dd0b5a9eab08e3b42ec8a3a1131635772da5" Dec 02 21:00:52 crc kubenswrapper[4807]: I1202 21:00:52.513191 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f892565ec3ade93084e0e9d472c0dd0b5a9eab08e3b42ec8a3a1131635772da5"} err="failed to get container status \"f892565ec3ade93084e0e9d472c0dd0b5a9eab08e3b42ec8a3a1131635772da5\": rpc error: code = NotFound desc = could not find container \"f892565ec3ade93084e0e9d472c0dd0b5a9eab08e3b42ec8a3a1131635772da5\": container with ID starting with f892565ec3ade93084e0e9d472c0dd0b5a9eab08e3b42ec8a3a1131635772da5 not found: ID does not exist" Dec 02 21:00:53 crc kubenswrapper[4807]: I1202 21:00:53.019545 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8269fab7-aff6-4efc-9767-04da48ff1138" path="/var/lib/kubelet/pods/8269fab7-aff6-4efc-9767-04da48ff1138/volumes" Dec 02 21:00:55 crc kubenswrapper[4807]: I1202 21:00:55.972930 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:00:55 crc kubenswrapper[4807]: E1202 21:00:55.973625 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:01:00 crc 
kubenswrapper[4807]: I1202 21:01:00.163126 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29411821-qfgdj"] Dec 02 21:01:00 crc kubenswrapper[4807]: E1202 21:01:00.164257 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8269fab7-aff6-4efc-9767-04da48ff1138" containerName="registry-server" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.164277 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8269fab7-aff6-4efc-9767-04da48ff1138" containerName="registry-server" Dec 02 21:01:00 crc kubenswrapper[4807]: E1202 21:01:00.164331 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8269fab7-aff6-4efc-9767-04da48ff1138" containerName="extract-content" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.164350 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8269fab7-aff6-4efc-9767-04da48ff1138" containerName="extract-content" Dec 02 21:01:00 crc kubenswrapper[4807]: E1202 21:01:00.164381 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8269fab7-aff6-4efc-9767-04da48ff1138" containerName="extract-utilities" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.164409 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8269fab7-aff6-4efc-9767-04da48ff1138" containerName="extract-utilities" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.164756 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="8269fab7-aff6-4efc-9767-04da48ff1138" containerName="registry-server" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.165612 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.180600 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411821-qfgdj"] Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.334797 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh9t8\" (UniqueName: \"kubernetes.io/projected/1f152d73-b7a0-4142-8f65-2343fca9dc2e-kube-api-access-wh9t8\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.334867 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-fernet-keys\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.335444 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-combined-ca-bundle\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.335595 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-config-data\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.437292 4807 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-wh9t8\" (UniqueName: \"kubernetes.io/projected/1f152d73-b7a0-4142-8f65-2343fca9dc2e-kube-api-access-wh9t8\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.437369 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-fernet-keys\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.437475 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-combined-ca-bundle\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.437526 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-config-data\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.444316 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-fernet-keys\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.444849 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-combined-ca-bundle\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.449026 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-config-data\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.469910 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh9t8\" (UniqueName: \"kubernetes.io/projected/1f152d73-b7a0-4142-8f65-2343fca9dc2e-kube-api-access-wh9t8\") pod \"keystone-cron-29411821-qfgdj\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:00 crc kubenswrapper[4807]: I1202 21:01:00.496570 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:01 crc kubenswrapper[4807]: I1202 21:01:01.031005 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411821-qfgdj"] Dec 02 21:01:01 crc kubenswrapper[4807]: I1202 21:01:01.516081 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411821-qfgdj" event={"ID":"1f152d73-b7a0-4142-8f65-2343fca9dc2e","Type":"ContainerStarted","Data":"e1095e32337db4ecd3e372c15f80c39ebf11b4919af57e56cdc12fa18a24181c"} Dec 02 21:01:01 crc kubenswrapper[4807]: I1202 21:01:01.517259 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411821-qfgdj" event={"ID":"1f152d73-b7a0-4142-8f65-2343fca9dc2e","Type":"ContainerStarted","Data":"3b2adf8697917aa1c60a8fa4a62dad22521ab507a6b939877fa4fcf5617fd9ae"} Dec 02 21:01:01 crc kubenswrapper[4807]: I1202 21:01:01.541699 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29411821-qfgdj" podStartSLOduration=1.54167299 podStartE2EDuration="1.54167299s" podCreationTimestamp="2025-12-02 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 21:01:01.537359825 +0000 UTC m=+3796.838267320" watchObservedRunningTime="2025-12-02 21:01:01.54167299 +0000 UTC m=+3796.842580485" Dec 02 21:01:03 crc kubenswrapper[4807]: I1202 21:01:03.549749 4807 generic.go:334] "Generic (PLEG): container finished" podID="1f152d73-b7a0-4142-8f65-2343fca9dc2e" containerID="e1095e32337db4ecd3e372c15f80c39ebf11b4919af57e56cdc12fa18a24181c" exitCode=0 Dec 02 21:01:03 crc kubenswrapper[4807]: I1202 21:01:03.549908 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411821-qfgdj" event={"ID":"1f152d73-b7a0-4142-8f65-2343fca9dc2e","Type":"ContainerDied","Data":"e1095e32337db4ecd3e372c15f80c39ebf11b4919af57e56cdc12fa18a24181c"} 
Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.021410 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.141671 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-config-data\") pod \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.142058 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-combined-ca-bundle\") pod \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.142168 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh9t8\" (UniqueName: \"kubernetes.io/projected/1f152d73-b7a0-4142-8f65-2343fca9dc2e-kube-api-access-wh9t8\") pod \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.142261 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-fernet-keys\") pod \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\" (UID: \"1f152d73-b7a0-4142-8f65-2343fca9dc2e\") " Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.147243 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1f152d73-b7a0-4142-8f65-2343fca9dc2e" (UID: "1f152d73-b7a0-4142-8f65-2343fca9dc2e"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.147504 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f152d73-b7a0-4142-8f65-2343fca9dc2e-kube-api-access-wh9t8" (OuterVolumeSpecName: "kube-api-access-wh9t8") pod "1f152d73-b7a0-4142-8f65-2343fca9dc2e" (UID: "1f152d73-b7a0-4142-8f65-2343fca9dc2e"). InnerVolumeSpecName "kube-api-access-wh9t8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.171841 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f152d73-b7a0-4142-8f65-2343fca9dc2e" (UID: "1f152d73-b7a0-4142-8f65-2343fca9dc2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.200311 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-config-data" (OuterVolumeSpecName: "config-data") pod "1f152d73-b7a0-4142-8f65-2343fca9dc2e" (UID: "1f152d73-b7a0-4142-8f65-2343fca9dc2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.244884 4807 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.244917 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.244926 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f152d73-b7a0-4142-8f65-2343fca9dc2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.244937 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh9t8\" (UniqueName: \"kubernetes.io/projected/1f152d73-b7a0-4142-8f65-2343fca9dc2e-kube-api-access-wh9t8\") on node \"crc\" DevicePath \"\"" Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.572891 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411821-qfgdj" event={"ID":"1f152d73-b7a0-4142-8f65-2343fca9dc2e","Type":"ContainerDied","Data":"3b2adf8697917aa1c60a8fa4a62dad22521ab507a6b939877fa4fcf5617fd9ae"} Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.572932 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b2adf8697917aa1c60a8fa4a62dad22521ab507a6b939877fa4fcf5617fd9ae" Dec 02 21:01:05 crc kubenswrapper[4807]: I1202 21:01:05.572991 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411821-qfgdj" Dec 02 21:01:10 crc kubenswrapper[4807]: I1202 21:01:10.974099 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:01:10 crc kubenswrapper[4807]: E1202 21:01:10.975086 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:01:25 crc kubenswrapper[4807]: I1202 21:01:25.973001 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:01:25 crc kubenswrapper[4807]: E1202 21:01:25.973920 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:01:40 crc kubenswrapper[4807]: I1202 21:01:40.973496 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:01:40 crc kubenswrapper[4807]: E1202 21:01:40.974712 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:01:55 crc kubenswrapper[4807]: I1202 21:01:55.972929 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:01:55 crc kubenswrapper[4807]: E1202 21:01:55.975273 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:02:08 crc kubenswrapper[4807]: I1202 21:02:08.973655 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:02:08 crc kubenswrapper[4807]: E1202 21:02:08.974508 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:02:23 crc kubenswrapper[4807]: I1202 21:02:23.974082 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:02:23 crc kubenswrapper[4807]: E1202 21:02:23.975404 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:02:38 crc kubenswrapper[4807]: I1202 21:02:38.973010 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:02:38 crc kubenswrapper[4807]: E1202 21:02:38.974345 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:02:50 crc kubenswrapper[4807]: I1202 21:02:50.973770 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:02:50 crc kubenswrapper[4807]: E1202 21:02:50.974542 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:03:03 crc kubenswrapper[4807]: I1202 21:03:03.973121 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:03:03 crc kubenswrapper[4807]: E1202 21:03:03.973950 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:03:17 crc kubenswrapper[4807]: I1202 21:03:17.971924 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:03:17 crc kubenswrapper[4807]: E1202 21:03:17.972533 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:03:30 crc kubenswrapper[4807]: I1202 21:03:30.975295 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:03:30 crc kubenswrapper[4807]: E1202 21:03:30.976378 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:03:43 crc kubenswrapper[4807]: I1202 21:03:43.972691 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:03:43 crc kubenswrapper[4807]: E1202 21:03:43.973356 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:03:56 crc kubenswrapper[4807]: I1202 21:03:56.972808 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:03:56 crc kubenswrapper[4807]: E1202 21:03:56.973689 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:04:11 crc kubenswrapper[4807]: I1202 21:04:11.974035 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:04:12 crc kubenswrapper[4807]: I1202 21:04:12.893973 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"375e1fadd6b44094984c3dfa802ca4b373ed316a82df44a301888e8f70285350"} Dec 02 21:06:28 crc kubenswrapper[4807]: I1202 21:06:28.292751 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:06:28 crc kubenswrapper[4807]: I1202 21:06:28.293507 4807 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 21:06:58 crc kubenswrapper[4807]: I1202 21:06:58.293518 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:06:58 crc kubenswrapper[4807]: I1202 21:06:58.294202 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 21:07:07 crc kubenswrapper[4807]: I1202 21:07:07.785027 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="df8d83f3-6675-416b-a039-2aafac45fe18" containerName="galera" probeResult="failure" output="command timed out" Dec 02 21:07:28 crc kubenswrapper[4807]: I1202 21:07:28.293473 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:07:28 crc kubenswrapper[4807]: I1202 21:07:28.294190 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 02 21:07:28 crc kubenswrapper[4807]: I1202 21:07:28.294310 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 21:07:28 crc kubenswrapper[4807]: I1202 21:07:28.295419 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"375e1fadd6b44094984c3dfa802ca4b373ed316a82df44a301888e8f70285350"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 21:07:28 crc kubenswrapper[4807]: I1202 21:07:28.295554 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://375e1fadd6b44094984c3dfa802ca4b373ed316a82df44a301888e8f70285350" gracePeriod=600 Dec 02 21:07:28 crc kubenswrapper[4807]: I1202 21:07:28.476398 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="375e1fadd6b44094984c3dfa802ca4b373ed316a82df44a301888e8f70285350" exitCode=0 Dec 02 21:07:28 crc kubenswrapper[4807]: I1202 21:07:28.476456 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"375e1fadd6b44094984c3dfa802ca4b373ed316a82df44a301888e8f70285350"} Dec 02 21:07:28 crc kubenswrapper[4807]: I1202 21:07:28.476498 4807 scope.go:117] "RemoveContainer" containerID="85e020926e595aeb3fec44f2646fe7a035a92d07a7008c8309cc8839bb2b14a0" Dec 02 21:07:29 crc kubenswrapper[4807]: I1202 21:07:29.511999 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"} Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.418830 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vd6mt"] Dec 02 21:07:43 crc kubenswrapper[4807]: E1202 21:07:43.419738 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f152d73-b7a0-4142-8f65-2343fca9dc2e" containerName="keystone-cron" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.419754 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f152d73-b7a0-4142-8f65-2343fca9dc2e" containerName="keystone-cron" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.420068 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f152d73-b7a0-4142-8f65-2343fca9dc2e" containerName="keystone-cron" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.421984 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.430454 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vd6mt"] Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.513165 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9dll\" (UniqueName: \"kubernetes.io/projected/13abe6c3-0a67-491f-abce-fc06c82b4707-kube-api-access-m9dll\") pod \"redhat-operators-vd6mt\" (UID: \"13abe6c3-0a67-491f-abce-fc06c82b4707\") " pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.513264 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13abe6c3-0a67-491f-abce-fc06c82b4707-catalog-content\") pod \"redhat-operators-vd6mt\" (UID: \"13abe6c3-0a67-491f-abce-fc06c82b4707\") " pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.513424 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13abe6c3-0a67-491f-abce-fc06c82b4707-utilities\") pod \"redhat-operators-vd6mt\" (UID: \"13abe6c3-0a67-491f-abce-fc06c82b4707\") " pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.615905 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9dll\" (UniqueName: \"kubernetes.io/projected/13abe6c3-0a67-491f-abce-fc06c82b4707-kube-api-access-m9dll\") pod \"redhat-operators-vd6mt\" (UID: \"13abe6c3-0a67-491f-abce-fc06c82b4707\") " pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.616243 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13abe6c3-0a67-491f-abce-fc06c82b4707-catalog-content\") pod \"redhat-operators-vd6mt\" (UID: \"13abe6c3-0a67-491f-abce-fc06c82b4707\") " pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.616295 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13abe6c3-0a67-491f-abce-fc06c82b4707-utilities\") pod \"redhat-operators-vd6mt\" (UID: \"13abe6c3-0a67-491f-abce-fc06c82b4707\") " pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.616820 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13abe6c3-0a67-491f-abce-fc06c82b4707-utilities\") pod \"redhat-operators-vd6mt\" (UID: \"13abe6c3-0a67-491f-abce-fc06c82b4707\") " pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.616825 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13abe6c3-0a67-491f-abce-fc06c82b4707-catalog-content\") pod \"redhat-operators-vd6mt\" (UID: \"13abe6c3-0a67-491f-abce-fc06c82b4707\") " pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.639985 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9dll\" (UniqueName: \"kubernetes.io/projected/13abe6c3-0a67-491f-abce-fc06c82b4707-kube-api-access-m9dll\") pod \"redhat-operators-vd6mt\" (UID: \"13abe6c3-0a67-491f-abce-fc06c82b4707\") " pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:07:43 crc kubenswrapper[4807]: I1202 21:07:43.758998 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:07:44 crc kubenswrapper[4807]: I1202 21:07:44.738560 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vd6mt"] Dec 02 21:07:45 crc kubenswrapper[4807]: I1202 21:07:45.750009 4807 generic.go:334] "Generic (PLEG): container finished" podID="13abe6c3-0a67-491f-abce-fc06c82b4707" containerID="fcb80cdffacc0f8c182a0ead020011447fc926b51ddd18be5a4d54fce3b47427" exitCode=0 Dec 02 21:07:45 crc kubenswrapper[4807]: I1202 21:07:45.750133 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vd6mt" event={"ID":"13abe6c3-0a67-491f-abce-fc06c82b4707","Type":"ContainerDied","Data":"fcb80cdffacc0f8c182a0ead020011447fc926b51ddd18be5a4d54fce3b47427"} Dec 02 21:07:45 crc kubenswrapper[4807]: I1202 21:07:45.750870 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vd6mt" event={"ID":"13abe6c3-0a67-491f-abce-fc06c82b4707","Type":"ContainerStarted","Data":"3cfcbae758aa0839f7d38375eddc3546ea0604d5cb4ddf1f3f789d692f2d6130"} Dec 02 21:07:45 crc kubenswrapper[4807]: I1202 21:07:45.758531 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.609814 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gwkzk"] Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.614402 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.642143 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwkzk"] Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.707239 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-catalog-content\") pod \"redhat-marketplace-gwkzk\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.707633 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-utilities\") pod \"redhat-marketplace-gwkzk\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.707675 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c4z6\" (UniqueName: \"kubernetes.io/projected/f89769ab-209d-4ab3-9759-a0f26b9400f0-kube-api-access-6c4z6\") pod \"redhat-marketplace-gwkzk\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.809940 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-catalog-content\") pod \"redhat-marketplace-gwkzk\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.809999 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-utilities\") pod \"redhat-marketplace-gwkzk\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.810035 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c4z6\" (UniqueName: \"kubernetes.io/projected/f89769ab-209d-4ab3-9759-a0f26b9400f0-kube-api-access-6c4z6\") pod \"redhat-marketplace-gwkzk\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.810746 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-catalog-content\") pod \"redhat-marketplace-gwkzk\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.810872 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-utilities\") pod \"redhat-marketplace-gwkzk\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.830782 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c4z6\" (UniqueName: \"kubernetes.io/projected/f89769ab-209d-4ab3-9759-a0f26b9400f0-kube-api-access-6c4z6\") pod \"redhat-marketplace-gwkzk\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:47 crc kubenswrapper[4807]: I1202 21:07:47.949399 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:48 crc kubenswrapper[4807]: I1202 21:07:48.433039 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwkzk"] Dec 02 21:07:48 crc kubenswrapper[4807]: I1202 21:07:48.783245 4807 generic.go:334] "Generic (PLEG): container finished" podID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerID="0fbb9fb15e22ef002feedbf38618f49cd5bb2f4601fee6c596d081538734f0f8" exitCode=0 Dec 02 21:07:48 crc kubenswrapper[4807]: I1202 21:07:48.783382 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwkzk" event={"ID":"f89769ab-209d-4ab3-9759-a0f26b9400f0","Type":"ContainerDied","Data":"0fbb9fb15e22ef002feedbf38618f49cd5bb2f4601fee6c596d081538734f0f8"} Dec 02 21:07:48 crc kubenswrapper[4807]: I1202 21:07:48.783597 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwkzk" event={"ID":"f89769ab-209d-4ab3-9759-a0f26b9400f0","Type":"ContainerStarted","Data":"d19842cdb56661e628e587e4cf815e12afd449f038871acbcf6210aab54c8577"} Dec 02 21:07:49 crc kubenswrapper[4807]: I1202 21:07:49.796464 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwkzk" event={"ID":"f89769ab-209d-4ab3-9759-a0f26b9400f0","Type":"ContainerStarted","Data":"be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009"} Dec 02 21:07:50 crc kubenswrapper[4807]: I1202 21:07:50.820008 4807 generic.go:334] "Generic (PLEG): container finished" podID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerID="be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009" exitCode=0 Dec 02 21:07:50 crc kubenswrapper[4807]: I1202 21:07:50.820147 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwkzk" 
event={"ID":"f89769ab-209d-4ab3-9759-a0f26b9400f0","Type":"ContainerDied","Data":"be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009"} Dec 02 21:07:55 crc kubenswrapper[4807]: I1202 21:07:55.877916 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwkzk" event={"ID":"f89769ab-209d-4ab3-9759-a0f26b9400f0","Type":"ContainerStarted","Data":"bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31"} Dec 02 21:07:55 crc kubenswrapper[4807]: I1202 21:07:55.886208 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vd6mt" event={"ID":"13abe6c3-0a67-491f-abce-fc06c82b4707","Type":"ContainerStarted","Data":"3d4a57e70567a6ae06f6309ec79cd21bbb300fa4890bd610841372b567470584"} Dec 02 21:07:55 crc kubenswrapper[4807]: I1202 21:07:55.914196 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gwkzk" podStartSLOduration=2.116166547 podStartE2EDuration="8.914169662s" podCreationTimestamp="2025-12-02 21:07:47 +0000 UTC" firstStartedPulling="2025-12-02 21:07:48.785851829 +0000 UTC m=+4204.086759314" lastFinishedPulling="2025-12-02 21:07:55.583854924 +0000 UTC m=+4210.884762429" observedRunningTime="2025-12-02 21:07:55.902913663 +0000 UTC m=+4211.203821198" watchObservedRunningTime="2025-12-02 21:07:55.914169662 +0000 UTC m=+4211.215077167" Dec 02 21:07:57 crc kubenswrapper[4807]: I1202 21:07:57.915393 4807 generic.go:334] "Generic (PLEG): container finished" podID="13abe6c3-0a67-491f-abce-fc06c82b4707" containerID="3d4a57e70567a6ae06f6309ec79cd21bbb300fa4890bd610841372b567470584" exitCode=0 Dec 02 21:07:57 crc kubenswrapper[4807]: I1202 21:07:57.915453 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vd6mt" 
event={"ID":"13abe6c3-0a67-491f-abce-fc06c82b4707","Type":"ContainerDied","Data":"3d4a57e70567a6ae06f6309ec79cd21bbb300fa4890bd610841372b567470584"} Dec 02 21:07:57 crc kubenswrapper[4807]: I1202 21:07:57.951627 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:57 crc kubenswrapper[4807]: I1202 21:07:57.951699 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:07:59 crc kubenswrapper[4807]: I1202 21:07:59.025515 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-gwkzk" podUID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerName="registry-server" probeResult="failure" output=< Dec 02 21:07:59 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 21:07:59 crc kubenswrapper[4807]: > Dec 02 21:07:59 crc kubenswrapper[4807]: I1202 21:07:59.938550 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vd6mt" event={"ID":"13abe6c3-0a67-491f-abce-fc06c82b4707","Type":"ContainerStarted","Data":"3c6a280fbfc7dd15a426a2b460fa174bf5df4bcd8929a800e679c28470fc3cf5"} Dec 02 21:07:59 crc kubenswrapper[4807]: I1202 21:07:59.964922 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vd6mt" podStartSLOduration=3.992826709 podStartE2EDuration="16.964894285s" podCreationTimestamp="2025-12-02 21:07:43 +0000 UTC" firstStartedPulling="2025-12-02 21:07:45.754114474 +0000 UTC m=+4201.055022009" lastFinishedPulling="2025-12-02 21:07:58.72618205 +0000 UTC m=+4214.027089585" observedRunningTime="2025-12-02 21:07:59.956766155 +0000 UTC m=+4215.257673660" watchObservedRunningTime="2025-12-02 21:07:59.964894285 +0000 UTC m=+4215.265801820" Dec 02 21:08:03 crc kubenswrapper[4807]: I1202 21:08:03.759419 4807 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:08:03 crc kubenswrapper[4807]: I1202 21:08:03.759967 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:08:04 crc kubenswrapper[4807]: I1202 21:08:04.820050 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vd6mt" podUID="13abe6c3-0a67-491f-abce-fc06c82b4707" containerName="registry-server" probeResult="failure" output=< Dec 02 21:08:04 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 21:08:04 crc kubenswrapper[4807]: > Dec 02 21:08:08 crc kubenswrapper[4807]: I1202 21:08:08.024086 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:08:08 crc kubenswrapper[4807]: I1202 21:08:08.086879 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:08:08 crc kubenswrapper[4807]: I1202 21:08:08.278234 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwkzk"] Dec 02 21:08:10 crc kubenswrapper[4807]: I1202 21:08:10.054943 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gwkzk" podUID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerName="registry-server" containerID="cri-o://bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31" gracePeriod=2 Dec 02 21:08:10 crc kubenswrapper[4807]: I1202 21:08:10.600649 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:08:10 crc kubenswrapper[4807]: I1202 21:08:10.649675 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-catalog-content\") pod \"f89769ab-209d-4ab3-9759-a0f26b9400f0\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " Dec 02 21:08:10 crc kubenswrapper[4807]: I1202 21:08:10.649794 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-utilities\") pod \"f89769ab-209d-4ab3-9759-a0f26b9400f0\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " Dec 02 21:08:10 crc kubenswrapper[4807]: I1202 21:08:10.649910 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c4z6\" (UniqueName: \"kubernetes.io/projected/f89769ab-209d-4ab3-9759-a0f26b9400f0-kube-api-access-6c4z6\") pod \"f89769ab-209d-4ab3-9759-a0f26b9400f0\" (UID: \"f89769ab-209d-4ab3-9759-a0f26b9400f0\") " Dec 02 21:08:10 crc kubenswrapper[4807]: I1202 21:08:10.650509 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-utilities" (OuterVolumeSpecName: "utilities") pod "f89769ab-209d-4ab3-9759-a0f26b9400f0" (UID: "f89769ab-209d-4ab3-9759-a0f26b9400f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:08:10 crc kubenswrapper[4807]: I1202 21:08:10.655958 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f89769ab-209d-4ab3-9759-a0f26b9400f0-kube-api-access-6c4z6" (OuterVolumeSpecName: "kube-api-access-6c4z6") pod "f89769ab-209d-4ab3-9759-a0f26b9400f0" (UID: "f89769ab-209d-4ab3-9759-a0f26b9400f0"). InnerVolumeSpecName "kube-api-access-6c4z6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:08:10 crc kubenswrapper[4807]: I1202 21:08:10.669991 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f89769ab-209d-4ab3-9759-a0f26b9400f0" (UID: "f89769ab-209d-4ab3-9759-a0f26b9400f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:08:10 crc kubenswrapper[4807]: I1202 21:08:10.753266 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c4z6\" (UniqueName: \"kubernetes.io/projected/f89769ab-209d-4ab3-9759-a0f26b9400f0-kube-api-access-6c4z6\") on node \"crc\" DevicePath \"\"" Dec 02 21:08:10 crc kubenswrapper[4807]: I1202 21:08:10.753295 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 21:08:10 crc kubenswrapper[4807]: I1202 21:08:10.753305 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f89769ab-209d-4ab3-9759-a0f26b9400f0-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.067962 4807 generic.go:334] "Generic (PLEG): container finished" podID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerID="bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31" exitCode=0 Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.068007 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gwkzk" event={"ID":"f89769ab-209d-4ab3-9759-a0f26b9400f0","Type":"ContainerDied","Data":"bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31"} Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.068035 4807 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-gwkzk" event={"ID":"f89769ab-209d-4ab3-9759-a0f26b9400f0","Type":"ContainerDied","Data":"d19842cdb56661e628e587e4cf815e12afd449f038871acbcf6210aab54c8577"} Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.068053 4807 scope.go:117] "RemoveContainer" containerID="bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31" Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.068134 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gwkzk" Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.102191 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwkzk"] Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.115147 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gwkzk"] Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.124435 4807 scope.go:117] "RemoveContainer" containerID="be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009" Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.152406 4807 scope.go:117] "RemoveContainer" containerID="0fbb9fb15e22ef002feedbf38618f49cd5bb2f4601fee6c596d081538734f0f8" Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.229340 4807 scope.go:117] "RemoveContainer" containerID="bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31" Dec 02 21:08:11 crc kubenswrapper[4807]: E1202 21:08:11.230062 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31\": container with ID starting with bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31 not found: ID does not exist" containerID="bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31" Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.230136 4807 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31"} err="failed to get container status \"bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31\": rpc error: code = NotFound desc = could not find container \"bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31\": container with ID starting with bf39ab0a2c16c04161421acb2e2406ee69409e47a0856d4377c23ef9687fcc31 not found: ID does not exist" Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.230181 4807 scope.go:117] "RemoveContainer" containerID="be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009" Dec 02 21:08:11 crc kubenswrapper[4807]: E1202 21:08:11.230657 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009\": container with ID starting with be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009 not found: ID does not exist" containerID="be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009" Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.230696 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009"} err="failed to get container status \"be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009\": rpc error: code = NotFound desc = could not find container \"be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009\": container with ID starting with be9cc4aa1b029e4c5e0e16dd7042fc4aaabc561d6dea386696fb51e4f70b3009 not found: ID does not exist" Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.230775 4807 scope.go:117] "RemoveContainer" containerID="0fbb9fb15e22ef002feedbf38618f49cd5bb2f4601fee6c596d081538734f0f8" Dec 02 21:08:11 crc kubenswrapper[4807]: E1202 
21:08:11.231170 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fbb9fb15e22ef002feedbf38618f49cd5bb2f4601fee6c596d081538734f0f8\": container with ID starting with 0fbb9fb15e22ef002feedbf38618f49cd5bb2f4601fee6c596d081538734f0f8 not found: ID does not exist" containerID="0fbb9fb15e22ef002feedbf38618f49cd5bb2f4601fee6c596d081538734f0f8" Dec 02 21:08:11 crc kubenswrapper[4807]: I1202 21:08:11.231216 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fbb9fb15e22ef002feedbf38618f49cd5bb2f4601fee6c596d081538734f0f8"} err="failed to get container status \"0fbb9fb15e22ef002feedbf38618f49cd5bb2f4601fee6c596d081538734f0f8\": rpc error: code = NotFound desc = could not find container \"0fbb9fb15e22ef002feedbf38618f49cd5bb2f4601fee6c596d081538734f0f8\": container with ID starting with 0fbb9fb15e22ef002feedbf38618f49cd5bb2f4601fee6c596d081538734f0f8 not found: ID does not exist" Dec 02 21:08:12 crc kubenswrapper[4807]: I1202 21:08:12.985424 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89769ab-209d-4ab3-9759-a0f26b9400f0" path="/var/lib/kubelet/pods/f89769ab-209d-4ab3-9759-a0f26b9400f0/volumes" Dec 02 21:08:13 crc kubenswrapper[4807]: I1202 21:08:13.817186 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:08:13 crc kubenswrapper[4807]: I1202 21:08:13.894300 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vd6mt" Dec 02 21:08:14 crc kubenswrapper[4807]: I1202 21:08:14.527824 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vd6mt"] Dec 02 21:08:14 crc kubenswrapper[4807]: I1202 21:08:14.677995 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cr7xz"] Dec 02 21:08:14 crc 
kubenswrapper[4807]: I1202 21:08:14.678313 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cr7xz" podUID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerName="registry-server" containerID="cri-o://c4f1c24df6b17fa0c07f299405aeeba203386417010e00fdfe19de9a591548cd" gracePeriod=2 Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.122493 4807 generic.go:334] "Generic (PLEG): container finished" podID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerID="c4f1c24df6b17fa0c07f299405aeeba203386417010e00fdfe19de9a591548cd" exitCode=0 Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.122579 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr7xz" event={"ID":"7e62aa90-5be8-469a-b984-6adcb92e4a91","Type":"ContainerDied","Data":"c4f1c24df6b17fa0c07f299405aeeba203386417010e00fdfe19de9a591548cd"} Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.122648 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cr7xz" event={"ID":"7e62aa90-5be8-469a-b984-6adcb92e4a91","Type":"ContainerDied","Data":"14d745c45a684d5b6dd881b66faf9365c2d28026ec280483c0b7de7ed681cebc"} Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.122662 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14d745c45a684d5b6dd881b66faf9365c2d28026ec280483c0b7de7ed681cebc" Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.647437 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cr7xz"
Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.672520 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-catalog-content\") pod \"7e62aa90-5be8-469a-b984-6adcb92e4a91\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") "
Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.672695 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-utilities\") pod \"7e62aa90-5be8-469a-b984-6adcb92e4a91\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") "
Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.672882 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mjgw\" (UniqueName: \"kubernetes.io/projected/7e62aa90-5be8-469a-b984-6adcb92e4a91-kube-api-access-9mjgw\") pod \"7e62aa90-5be8-469a-b984-6adcb92e4a91\" (UID: \"7e62aa90-5be8-469a-b984-6adcb92e4a91\") "
Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.673309 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-utilities" (OuterVolumeSpecName: "utilities") pod "7e62aa90-5be8-469a-b984-6adcb92e4a91" (UID: "7e62aa90-5be8-469a-b984-6adcb92e4a91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.684034 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e62aa90-5be8-469a-b984-6adcb92e4a91-kube-api-access-9mjgw" (OuterVolumeSpecName: "kube-api-access-9mjgw") pod "7e62aa90-5be8-469a-b984-6adcb92e4a91" (UID: "7e62aa90-5be8-469a-b984-6adcb92e4a91"). InnerVolumeSpecName "kube-api-access-9mjgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.775896 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.775925 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mjgw\" (UniqueName: \"kubernetes.io/projected/7e62aa90-5be8-469a-b984-6adcb92e4a91-kube-api-access-9mjgw\") on node \"crc\" DevicePath \"\""
Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.780214 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e62aa90-5be8-469a-b984-6adcb92e4a91" (UID: "7e62aa90-5be8-469a-b984-6adcb92e4a91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 21:08:15 crc kubenswrapper[4807]: I1202 21:08:15.878838 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e62aa90-5be8-469a-b984-6adcb92e4a91-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 21:08:16 crc kubenswrapper[4807]: I1202 21:08:16.131504 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cr7xz"
Dec 02 21:08:16 crc kubenswrapper[4807]: I1202 21:08:16.175067 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cr7xz"]
Dec 02 21:08:16 crc kubenswrapper[4807]: I1202 21:08:16.185666 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cr7xz"]
Dec 02 21:08:16 crc kubenswrapper[4807]: I1202 21:08:16.986434 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e62aa90-5be8-469a-b984-6adcb92e4a91" path="/var/lib/kubelet/pods/7e62aa90-5be8-469a-b984-6adcb92e4a91/volumes"
Dec 02 21:09:12 crc kubenswrapper[4807]: I1202 21:09:12.808397 4807 scope.go:117] "RemoveContainer" containerID="5ac98f55d5351b27e1e1cddf0e68dac26fcc4f008ab4d21dc18134f06a4eb6fc"
Dec 02 21:09:12 crc kubenswrapper[4807]: I1202 21:09:12.837441 4807 scope.go:117] "RemoveContainer" containerID="360867d23729c69afaff35f2ee3bebfd946f6dc7bf8091bbe640dde8196d99bb"
Dec 02 21:09:12 crc kubenswrapper[4807]: I1202 21:09:12.904079 4807 scope.go:117] "RemoveContainer" containerID="c4f1c24df6b17fa0c07f299405aeeba203386417010e00fdfe19de9a591548cd"
Dec 02 21:09:28 crc kubenswrapper[4807]: I1202 21:09:28.292760 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 21:09:28 crc kubenswrapper[4807]: I1202 21:09:28.293615 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 21:09:58 crc kubenswrapper[4807]: I1202 21:09:58.293659 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 21:09:58 crc kubenswrapper[4807]: I1202 21:09:58.294373 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 21:10:28 crc kubenswrapper[4807]: I1202 21:10:28.292974 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 21:10:28 crc kubenswrapper[4807]: I1202 21:10:28.293739 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 21:10:28 crc kubenswrapper[4807]: I1202 21:10:28.293813 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5"
Dec 02 21:10:28 crc kubenswrapper[4807]: I1202 21:10:28.294932 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 21:10:28 crc kubenswrapper[4807]: I1202 21:10:28.295033 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" gracePeriod=600
Dec 02 21:10:28 crc kubenswrapper[4807]: E1202 21:10:28.448412 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:10:28 crc kubenswrapper[4807]: I1202 21:10:28.797065 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" exitCode=0
Dec 02 21:10:28 crc kubenswrapper[4807]: I1202 21:10:28.797215 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"}
Dec 02 21:10:28 crc kubenswrapper[4807]: I1202 21:10:28.797299 4807 scope.go:117] "RemoveContainer" containerID="375e1fadd6b44094984c3dfa802ca4b373ed316a82df44a301888e8f70285350"
Dec 02 21:10:28 crc kubenswrapper[4807]: I1202 21:10:28.798036 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:10:28 crc kubenswrapper[4807]: E1202 21:10:28.798302 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:10:41 crc kubenswrapper[4807]: I1202 21:10:41.974211 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:10:41 crc kubenswrapper[4807]: E1202 21:10:41.975179 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:10:55 crc kubenswrapper[4807]: I1202 21:10:55.972408 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:10:55 crc kubenswrapper[4807]: E1202 21:10:55.974427 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:11:10 crc kubenswrapper[4807]: I1202 21:11:10.977077 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:11:10 crc kubenswrapper[4807]: E1202 21:11:10.978674 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:11:21 crc kubenswrapper[4807]: I1202 21:11:21.973083 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:11:21 crc kubenswrapper[4807]: E1202 21:11:21.974368 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:11:32 crc kubenswrapper[4807]: I1202 21:11:32.973262 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:11:32 crc kubenswrapper[4807]: E1202 21:11:32.974352 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:11:46 crc kubenswrapper[4807]: I1202 21:11:46.974083 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:11:46 crc kubenswrapper[4807]: E1202 21:11:46.975473 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.024108 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nsfjk"]
Dec 02 21:11:52 crc kubenswrapper[4807]: E1202 21:11:52.025379 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerName="extract-content"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.025401 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerName="extract-content"
Dec 02 21:11:52 crc kubenswrapper[4807]: E1202 21:11:52.025444 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerName="extract-utilities"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.025459 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerName="extract-utilities"
Dec 02 21:11:52 crc kubenswrapper[4807]: E1202 21:11:52.025493 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerName="registry-server"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.025505 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerName="registry-server"
Dec 02 21:11:52 crc kubenswrapper[4807]: E1202 21:11:52.025558 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerName="extract-utilities"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.025570 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerName="extract-utilities"
Dec 02 21:11:52 crc kubenswrapper[4807]: E1202 21:11:52.025597 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerName="registry-server"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.025610 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerName="registry-server"
Dec 02 21:11:52 crc kubenswrapper[4807]: E1202 21:11:52.025641 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerName="extract-content"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.025652 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerName="extract-content"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.026006 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f89769ab-209d-4ab3-9759-a0f26b9400f0" containerName="registry-server"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.026049 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e62aa90-5be8-469a-b984-6adcb92e4a91" containerName="registry-server"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.029308 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.053934 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nsfjk"]
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.140881 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-utilities\") pod \"community-operators-nsfjk\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") " pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.140975 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-catalog-content\") pod \"community-operators-nsfjk\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") " pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.141043 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhf9t\" (UniqueName: \"kubernetes.io/projected/4c3b84d1-1596-4d47-90e3-55a2edc50d32-kube-api-access-dhf9t\") pod \"community-operators-nsfjk\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") " pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.243283 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-catalog-content\") pod \"community-operators-nsfjk\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") " pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.243398 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhf9t\" (UniqueName: \"kubernetes.io/projected/4c3b84d1-1596-4d47-90e3-55a2edc50d32-kube-api-access-dhf9t\") pod \"community-operators-nsfjk\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") " pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.243573 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-utilities\") pod \"community-operators-nsfjk\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") " pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.243815 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-catalog-content\") pod \"community-operators-nsfjk\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") " pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.245104 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-utilities\") pod \"community-operators-nsfjk\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") " pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.276638 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhf9t\" (UniqueName: \"kubernetes.io/projected/4c3b84d1-1596-4d47-90e3-55a2edc50d32-kube-api-access-dhf9t\") pod \"community-operators-nsfjk\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") " pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.371608 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:11:52 crc kubenswrapper[4807]: I1202 21:11:52.907925 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nsfjk"]
Dec 02 21:11:52 crc kubenswrapper[4807]: W1202 21:11:52.921800 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c3b84d1_1596_4d47_90e3_55a2edc50d32.slice/crio-9ef0d35cff889ef19c68677ea236bf09c48db0429c853323410fa6d90e175d13 WatchSource:0}: Error finding container 9ef0d35cff889ef19c68677ea236bf09c48db0429c853323410fa6d90e175d13: Status 404 returned error can't find the container with id 9ef0d35cff889ef19c68677ea236bf09c48db0429c853323410fa6d90e175d13
Dec 02 21:11:53 crc kubenswrapper[4807]: I1202 21:11:53.958171 4807 generic.go:334] "Generic (PLEG): container finished" podID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" containerID="33eb27c0ff531d008fc6f0e0feecebb061c1e0a82aa8d199fda5c44bcee73d41" exitCode=0
Dec 02 21:11:53 crc kubenswrapper[4807]: I1202 21:11:53.958249 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsfjk" event={"ID":"4c3b84d1-1596-4d47-90e3-55a2edc50d32","Type":"ContainerDied","Data":"33eb27c0ff531d008fc6f0e0feecebb061c1e0a82aa8d199fda5c44bcee73d41"}
Dec 02 21:11:53 crc kubenswrapper[4807]: I1202 21:11:53.958618 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsfjk" event={"ID":"4c3b84d1-1596-4d47-90e3-55a2edc50d32","Type":"ContainerStarted","Data":"9ef0d35cff889ef19c68677ea236bf09c48db0429c853323410fa6d90e175d13"}
Dec 02 21:11:54 crc kubenswrapper[4807]: I1202 21:11:54.987196 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsfjk" event={"ID":"4c3b84d1-1596-4d47-90e3-55a2edc50d32","Type":"ContainerStarted","Data":"53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146"}
Dec 02 21:11:55 crc kubenswrapper[4807]: I1202 21:11:55.986967 4807 generic.go:334] "Generic (PLEG): container finished" podID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" containerID="53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146" exitCode=0
Dec 02 21:11:55 crc kubenswrapper[4807]: I1202 21:11:55.987298 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsfjk" event={"ID":"4c3b84d1-1596-4d47-90e3-55a2edc50d32","Type":"ContainerDied","Data":"53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146"}
Dec 02 21:11:57 crc kubenswrapper[4807]: I1202 21:11:57.000433 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsfjk" event={"ID":"4c3b84d1-1596-4d47-90e3-55a2edc50d32","Type":"ContainerStarted","Data":"d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94"}
Dec 02 21:11:57 crc kubenswrapper[4807]: I1202 21:11:57.024075 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nsfjk" podStartSLOduration=3.462108688 podStartE2EDuration="6.024055821s" podCreationTimestamp="2025-12-02 21:11:51 +0000 UTC" firstStartedPulling="2025-12-02 21:11:53.960585429 +0000 UTC m=+4449.261492934" lastFinishedPulling="2025-12-02 21:11:56.522532572 +0000 UTC m=+4451.823440067" observedRunningTime="2025-12-02 21:11:57.016905869 +0000 UTC m=+4452.317813364" watchObservedRunningTime="2025-12-02 21:11:57.024055821 +0000 UTC m=+4452.324963316"
Dec 02 21:11:59 crc kubenswrapper[4807]: I1202 21:11:59.972416 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:11:59 crc kubenswrapper[4807]: E1202 21:11:59.973545 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:12:02 crc kubenswrapper[4807]: I1202 21:12:02.372831 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:12:02 crc kubenswrapper[4807]: I1202 21:12:02.373410 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:12:02 crc kubenswrapper[4807]: I1202 21:12:02.422013 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:12:03 crc kubenswrapper[4807]: I1202 21:12:03.152482 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:12:03 crc kubenswrapper[4807]: I1202 21:12:03.225695 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nsfjk"]
Dec 02 21:12:05 crc kubenswrapper[4807]: I1202 21:12:05.090689 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nsfjk" podUID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" containerName="registry-server" containerID="cri-o://d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94" gracePeriod=2
Dec 02 21:12:05 crc kubenswrapper[4807]: I1202 21:12:05.651495 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:12:05 crc kubenswrapper[4807]: I1202 21:12:05.821485 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-catalog-content\") pod \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") "
Dec 02 21:12:05 crc kubenswrapper[4807]: I1202 21:12:05.821576 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-utilities\") pod \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") "
Dec 02 21:12:05 crc kubenswrapper[4807]: I1202 21:12:05.821792 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhf9t\" (UniqueName: \"kubernetes.io/projected/4c3b84d1-1596-4d47-90e3-55a2edc50d32-kube-api-access-dhf9t\") pod \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\" (UID: \"4c3b84d1-1596-4d47-90e3-55a2edc50d32\") "
Dec 02 21:12:05 crc kubenswrapper[4807]: I1202 21:12:05.823032 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-utilities" (OuterVolumeSpecName: "utilities") pod "4c3b84d1-1596-4d47-90e3-55a2edc50d32" (UID: "4c3b84d1-1596-4d47-90e3-55a2edc50d32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 21:12:05 crc kubenswrapper[4807]: I1202 21:12:05.836103 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c3b84d1-1596-4d47-90e3-55a2edc50d32-kube-api-access-dhf9t" (OuterVolumeSpecName: "kube-api-access-dhf9t") pod "4c3b84d1-1596-4d47-90e3-55a2edc50d32" (UID: "4c3b84d1-1596-4d47-90e3-55a2edc50d32"). InnerVolumeSpecName "kube-api-access-dhf9t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 21:12:05 crc kubenswrapper[4807]: I1202 21:12:05.890083 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c3b84d1-1596-4d47-90e3-55a2edc50d32" (UID: "4c3b84d1-1596-4d47-90e3-55a2edc50d32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 21:12:05 crc kubenswrapper[4807]: I1202 21:12:05.923639 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 21:12:05 crc kubenswrapper[4807]: I1202 21:12:05.923672 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c3b84d1-1596-4d47-90e3-55a2edc50d32-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 21:12:05 crc kubenswrapper[4807]: I1202 21:12:05.923683 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhf9t\" (UniqueName: \"kubernetes.io/projected/4c3b84d1-1596-4d47-90e3-55a2edc50d32-kube-api-access-dhf9t\") on node \"crc\" DevicePath \"\""
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.103514 4807 generic.go:334] "Generic (PLEG): container finished" podID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" containerID="d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94" exitCode=0
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.103585 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsfjk" event={"ID":"4c3b84d1-1596-4d47-90e3-55a2edc50d32","Type":"ContainerDied","Data":"d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94"}
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.103622 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nsfjk" event={"ID":"4c3b84d1-1596-4d47-90e3-55a2edc50d32","Type":"ContainerDied","Data":"9ef0d35cff889ef19c68677ea236bf09c48db0429c853323410fa6d90e175d13"}
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.103643 4807 scope.go:117] "RemoveContainer" containerID="d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94"
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.103877 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nsfjk"
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.127267 4807 scope.go:117] "RemoveContainer" containerID="53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146"
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.150073 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nsfjk"]
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.164603 4807 scope.go:117] "RemoveContainer" containerID="33eb27c0ff531d008fc6f0e0feecebb061c1e0a82aa8d199fda5c44bcee73d41"
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.168709 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nsfjk"]
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.229461 4807 scope.go:117] "RemoveContainer" containerID="d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94"
Dec 02 21:12:06 crc kubenswrapper[4807]: E1202 21:12:06.230036 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94\": container with ID starting with d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94 not found: ID does not exist" containerID="d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94"
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.230084 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94"} err="failed to get container status \"d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94\": rpc error: code = NotFound desc = could not find container \"d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94\": container with ID starting with d58004d0e7b7b1ef1bbedb1bbc3f60e2a096ade47dcfb6bf59a88ab251534c94 not found: ID does not exist"
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.230111 4807 scope.go:117] "RemoveContainer" containerID="53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146"
Dec 02 21:12:06 crc kubenswrapper[4807]: E1202 21:12:06.230544 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146\": container with ID starting with 53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146 not found: ID does not exist" containerID="53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146"
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.230576 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146"} err="failed to get container status \"53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146\": rpc error: code = NotFound desc = could not find container \"53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146\": container with ID starting with 53ca4c5f6f696080957ba314822e2fc5adcb71f623a50a16268518f7b0a98146 not found: ID does not exist"
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.230595 4807 scope.go:117] "RemoveContainer" containerID="33eb27c0ff531d008fc6f0e0feecebb061c1e0a82aa8d199fda5c44bcee73d41"
Dec 02 21:12:06 crc kubenswrapper[4807]: E1202 21:12:06.230958 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33eb27c0ff531d008fc6f0e0feecebb061c1e0a82aa8d199fda5c44bcee73d41\": container with ID starting with 33eb27c0ff531d008fc6f0e0feecebb061c1e0a82aa8d199fda5c44bcee73d41 not found: ID does not exist" containerID="33eb27c0ff531d008fc6f0e0feecebb061c1e0a82aa8d199fda5c44bcee73d41"
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.231009 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33eb27c0ff531d008fc6f0e0feecebb061c1e0a82aa8d199fda5c44bcee73d41"} err="failed to get container status \"33eb27c0ff531d008fc6f0e0feecebb061c1e0a82aa8d199fda5c44bcee73d41\": rpc error: code = NotFound desc = could not find container \"33eb27c0ff531d008fc6f0e0feecebb061c1e0a82aa8d199fda5c44bcee73d41\": container with ID starting with 33eb27c0ff531d008fc6f0e0feecebb061c1e0a82aa8d199fda5c44bcee73d41 not found: ID does not exist"
Dec 02 21:12:06 crc kubenswrapper[4807]: I1202 21:12:06.989838 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" path="/var/lib/kubelet/pods/4c3b84d1-1596-4d47-90e3-55a2edc50d32/volumes"
Dec 02 21:12:14 crc kubenswrapper[4807]: I1202 21:12:14.972871 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:12:14 crc kubenswrapper[4807]: E1202 21:12:14.973977 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:12:27 crc kubenswrapper[4807]: I1202 21:12:27.972646 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:12:27 crc kubenswrapper[4807]: E1202 21:12:27.974568 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:12:38 crc kubenswrapper[4807]: I1202 21:12:38.973344 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:12:38 crc kubenswrapper[4807]: E1202 21:12:38.974297 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:12:52 crc kubenswrapper[4807]: I1202 21:12:52.972940 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:12:52 crc kubenswrapper[4807]: E1202 21:12:52.973695 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec
02 21:13:03 crc kubenswrapper[4807]: I1202 21:13:03.973167 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:13:03 crc kubenswrapper[4807]: E1202 21:13:03.973831 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:13:17 crc kubenswrapper[4807]: I1202 21:13:17.990340 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:13:17 crc kubenswrapper[4807]: E1202 21:13:17.994266 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:13:29 crc kubenswrapper[4807]: I1202 21:13:29.973323 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:13:29 crc kubenswrapper[4807]: E1202 21:13:29.974523 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" 
podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:13:43 crc kubenswrapper[4807]: I1202 21:13:43.972613 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:13:43 crc kubenswrapper[4807]: E1202 21:13:43.973514 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:13:54 crc kubenswrapper[4807]: I1202 21:13:54.982144 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:13:54 crc kubenswrapper[4807]: E1202 21:13:54.982958 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:14:06 crc kubenswrapper[4807]: I1202 21:14:06.974242 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:14:06 crc kubenswrapper[4807]: E1202 21:14:06.975740 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:14:17 crc kubenswrapper[4807]: I1202 21:14:17.972051 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:14:17 crc kubenswrapper[4807]: E1202 21:14:17.972625 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:14:28 crc kubenswrapper[4807]: I1202 21:14:28.975539 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:14:28 crc kubenswrapper[4807]: E1202 21:14:28.976517 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:14:43 crc kubenswrapper[4807]: I1202 21:14:43.972525 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:14:43 crc kubenswrapper[4807]: E1202 21:14:43.973771 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:14:56 crc kubenswrapper[4807]: I1202 21:14:56.972642 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:14:56 crc kubenswrapper[4807]: E1202 21:14:56.973947 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.209880 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q"] Dec 02 21:15:00 crc kubenswrapper[4807]: E1202 21:15:00.210916 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" containerName="registry-server" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.210943 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" containerName="registry-server" Dec 02 21:15:00 crc kubenswrapper[4807]: E1202 21:15:00.210957 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" containerName="extract-utilities" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.210969 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" containerName="extract-utilities" Dec 02 21:15:00 crc kubenswrapper[4807]: E1202 21:15:00.211031 4807 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" containerName="extract-content" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.211045 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" containerName="extract-content" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.211381 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3b84d1-1596-4d47-90e3-55a2edc50d32" containerName="registry-server" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.212556 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.214626 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.215624 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.250784 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q"] Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.285949 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69390329-369f-47ed-a905-8ac11578f356-secret-volume\") pod \"collect-profiles-29411835-9jq5q\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.286003 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqsqv\" (UniqueName: 
\"kubernetes.io/projected/69390329-369f-47ed-a905-8ac11578f356-kube-api-access-nqsqv\") pod \"collect-profiles-29411835-9jq5q\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.286059 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69390329-369f-47ed-a905-8ac11578f356-config-volume\") pod \"collect-profiles-29411835-9jq5q\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.388038 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqsqv\" (UniqueName: \"kubernetes.io/projected/69390329-369f-47ed-a905-8ac11578f356-kube-api-access-nqsqv\") pod \"collect-profiles-29411835-9jq5q\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.388148 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69390329-369f-47ed-a905-8ac11578f356-config-volume\") pod \"collect-profiles-29411835-9jq5q\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.388351 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69390329-369f-47ed-a905-8ac11578f356-secret-volume\") pod \"collect-profiles-29411835-9jq5q\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:00 crc 
kubenswrapper[4807]: I1202 21:15:00.389048 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69390329-369f-47ed-a905-8ac11578f356-config-volume\") pod \"collect-profiles-29411835-9jq5q\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.395709 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69390329-369f-47ed-a905-8ac11578f356-secret-volume\") pod \"collect-profiles-29411835-9jq5q\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.405649 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqsqv\" (UniqueName: \"kubernetes.io/projected/69390329-369f-47ed-a905-8ac11578f356-kube-api-access-nqsqv\") pod \"collect-profiles-29411835-9jq5q\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:00 crc kubenswrapper[4807]: I1202 21:15:00.541086 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:01 crc kubenswrapper[4807]: I1202 21:15:01.058285 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q"] Dec 02 21:15:01 crc kubenswrapper[4807]: I1202 21:15:01.200711 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" event={"ID":"69390329-369f-47ed-a905-8ac11578f356","Type":"ContainerStarted","Data":"b9bccf41215130a1b63b353bf3a65ba66e69a2905f6e585629927abac53e34a0"} Dec 02 21:15:02 crc kubenswrapper[4807]: I1202 21:15:02.217008 4807 generic.go:334] "Generic (PLEG): container finished" podID="69390329-369f-47ed-a905-8ac11578f356" containerID="278965db1d112109822a02ec4f355216cde297d54a0e4c432385ecc7d2e33a8d" exitCode=0 Dec 02 21:15:02 crc kubenswrapper[4807]: I1202 21:15:02.217082 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" event={"ID":"69390329-369f-47ed-a905-8ac11578f356","Type":"ContainerDied","Data":"278965db1d112109822a02ec4f355216cde297d54a0e4c432385ecc7d2e33a8d"} Dec 02 21:15:03 crc kubenswrapper[4807]: I1202 21:15:03.691313 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:03 crc kubenswrapper[4807]: I1202 21:15:03.764712 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69390329-369f-47ed-a905-8ac11578f356-config-volume\") pod \"69390329-369f-47ed-a905-8ac11578f356\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " Dec 02 21:15:03 crc kubenswrapper[4807]: I1202 21:15:03.764946 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69390329-369f-47ed-a905-8ac11578f356-secret-volume\") pod \"69390329-369f-47ed-a905-8ac11578f356\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " Dec 02 21:15:03 crc kubenswrapper[4807]: I1202 21:15:03.765189 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqsqv\" (UniqueName: \"kubernetes.io/projected/69390329-369f-47ed-a905-8ac11578f356-kube-api-access-nqsqv\") pod \"69390329-369f-47ed-a905-8ac11578f356\" (UID: \"69390329-369f-47ed-a905-8ac11578f356\") " Dec 02 21:15:03 crc kubenswrapper[4807]: I1202 21:15:03.766117 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69390329-369f-47ed-a905-8ac11578f356-config-volume" (OuterVolumeSpecName: "config-volume") pod "69390329-369f-47ed-a905-8ac11578f356" (UID: "69390329-369f-47ed-a905-8ac11578f356"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 21:15:03 crc kubenswrapper[4807]: I1202 21:15:03.771844 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69390329-369f-47ed-a905-8ac11578f356-kube-api-access-nqsqv" (OuterVolumeSpecName: "kube-api-access-nqsqv") pod "69390329-369f-47ed-a905-8ac11578f356" (UID: "69390329-369f-47ed-a905-8ac11578f356"). 
InnerVolumeSpecName "kube-api-access-nqsqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:15:03 crc kubenswrapper[4807]: I1202 21:15:03.773026 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69390329-369f-47ed-a905-8ac11578f356-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "69390329-369f-47ed-a905-8ac11578f356" (UID: "69390329-369f-47ed-a905-8ac11578f356"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 21:15:03 crc kubenswrapper[4807]: I1202 21:15:03.868138 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/69390329-369f-47ed-a905-8ac11578f356-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 21:15:03 crc kubenswrapper[4807]: I1202 21:15:03.868210 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqsqv\" (UniqueName: \"kubernetes.io/projected/69390329-369f-47ed-a905-8ac11578f356-kube-api-access-nqsqv\") on node \"crc\" DevicePath \"\"" Dec 02 21:15:03 crc kubenswrapper[4807]: I1202 21:15:03.868224 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69390329-369f-47ed-a905-8ac11578f356-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 21:15:04 crc kubenswrapper[4807]: I1202 21:15:04.247836 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" event={"ID":"69390329-369f-47ed-a905-8ac11578f356","Type":"ContainerDied","Data":"b9bccf41215130a1b63b353bf3a65ba66e69a2905f6e585629927abac53e34a0"} Dec 02 21:15:04 crc kubenswrapper[4807]: I1202 21:15:04.248189 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9bccf41215130a1b63b353bf3a65ba66e69a2905f6e585629927abac53e34a0" Dec 02 21:15:04 crc kubenswrapper[4807]: I1202 21:15:04.248066 4807 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411835-9jq5q" Dec 02 21:15:04 crc kubenswrapper[4807]: I1202 21:15:04.797405 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"] Dec 02 21:15:04 crc kubenswrapper[4807]: I1202 21:15:04.805081 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411790-cxdwh"] Dec 02 21:15:04 crc kubenswrapper[4807]: I1202 21:15:04.984377 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="769c589c-2259-4bbd-8766-92681831eccb" path="/var/lib/kubelet/pods/769c589c-2259-4bbd-8766-92681831eccb/volumes" Dec 02 21:15:10 crc kubenswrapper[4807]: E1202 21:15:10.104859 4807 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.36:57190->38.102.83.36:35593: write tcp 38.102.83.36:57190->38.102.83.36:35593: write: broken pipe Dec 02 21:15:10 crc kubenswrapper[4807]: I1202 21:15:10.972668 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:15:10 crc kubenswrapper[4807]: E1202 21:15:10.973595 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:15:13 crc kubenswrapper[4807]: I1202 21:15:13.157711 4807 scope.go:117] "RemoveContainer" containerID="739669dc3950619e2fe474531d7008051b6b1eb09596d83a1091dc2cbefdb502" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.649613 4807 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-sh5mj"] Dec 02 21:15:19 crc kubenswrapper[4807]: E1202 21:15:19.650904 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69390329-369f-47ed-a905-8ac11578f356" containerName="collect-profiles" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.650922 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="69390329-369f-47ed-a905-8ac11578f356" containerName="collect-profiles" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.651177 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="69390329-369f-47ed-a905-8ac11578f356" containerName="collect-profiles" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.653177 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.663002 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sh5mj"] Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.742079 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-catalog-content\") pod \"certified-operators-sh5mj\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.742320 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-utilities\") pod \"certified-operators-sh5mj\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.742613 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-tcm28\" (UniqueName: \"kubernetes.io/projected/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-kube-api-access-tcm28\") pod \"certified-operators-sh5mj\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.844808 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-catalog-content\") pod \"certified-operators-sh5mj\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.844891 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-utilities\") pod \"certified-operators-sh5mj\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.844961 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcm28\" (UniqueName: \"kubernetes.io/projected/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-kube-api-access-tcm28\") pod \"certified-operators-sh5mj\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.845473 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-catalog-content\") pod \"certified-operators-sh5mj\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.845669 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-utilities\") pod \"certified-operators-sh5mj\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.865029 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcm28\" (UniqueName: \"kubernetes.io/projected/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-kube-api-access-tcm28\") pod \"certified-operators-sh5mj\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:19 crc kubenswrapper[4807]: I1202 21:15:19.978885 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:20 crc kubenswrapper[4807]: I1202 21:15:20.546801 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sh5mj"] Dec 02 21:15:21 crc kubenswrapper[4807]: I1202 21:15:21.470563 4807 generic.go:334] "Generic (PLEG): container finished" podID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" containerID="eb9d30b04984d5cf168839fc5bc12b494f3726bfcc70459111c544adca1bb8c3" exitCode=0 Dec 02 21:15:21 crc kubenswrapper[4807]: I1202 21:15:21.470809 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sh5mj" event={"ID":"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4","Type":"ContainerDied","Data":"eb9d30b04984d5cf168839fc5bc12b494f3726bfcc70459111c544adca1bb8c3"} Dec 02 21:15:21 crc kubenswrapper[4807]: I1202 21:15:21.470892 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sh5mj" event={"ID":"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4","Type":"ContainerStarted","Data":"6354211be40a6b85078d2e8826843320673d23a144fed18bdc6ce758153e9e8d"} Dec 02 21:15:21 crc kubenswrapper[4807]: I1202 21:15:21.473985 4807 provider.go:102] 
Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 21:15:23 crc kubenswrapper[4807]: I1202 21:15:23.493366 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sh5mj" event={"ID":"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4","Type":"ContainerStarted","Data":"e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16"} Dec 02 21:15:23 crc kubenswrapper[4807]: E1202 21:15:23.796143 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1cb0e71_c9f7_47d7_bece_57b7f13bc6e4.slice/crio-conmon-e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16.scope\": RecentStats: unable to find data in memory cache]" Dec 02 21:15:23 crc kubenswrapper[4807]: I1202 21:15:23.973028 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:15:23 crc kubenswrapper[4807]: E1202 21:15:23.973343 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:15:24 crc kubenswrapper[4807]: I1202 21:15:24.507027 4807 generic.go:334] "Generic (PLEG): container finished" podID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" containerID="e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16" exitCode=0 Dec 02 21:15:24 crc kubenswrapper[4807]: I1202 21:15:24.507091 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sh5mj" 
event={"ID":"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4","Type":"ContainerDied","Data":"e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16"} Dec 02 21:15:25 crc kubenswrapper[4807]: I1202 21:15:25.520309 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sh5mj" event={"ID":"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4","Type":"ContainerStarted","Data":"341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac"} Dec 02 21:15:25 crc kubenswrapper[4807]: I1202 21:15:25.542815 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sh5mj" podStartSLOduration=3.115168282 podStartE2EDuration="6.542793319s" podCreationTimestamp="2025-12-02 21:15:19 +0000 UTC" firstStartedPulling="2025-12-02 21:15:21.473769145 +0000 UTC m=+4656.774676640" lastFinishedPulling="2025-12-02 21:15:24.901394172 +0000 UTC m=+4660.202301677" observedRunningTime="2025-12-02 21:15:25.540893236 +0000 UTC m=+4660.841800731" watchObservedRunningTime="2025-12-02 21:15:25.542793319 +0000 UTC m=+4660.843700824" Dec 02 21:15:29 crc kubenswrapper[4807]: I1202 21:15:29.980159 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:29 crc kubenswrapper[4807]: I1202 21:15:29.980617 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:30 crc kubenswrapper[4807]: I1202 21:15:30.517967 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:30 crc kubenswrapper[4807]: I1202 21:15:30.674275 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:30 crc kubenswrapper[4807]: I1202 21:15:30.789204 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-sh5mj"] Dec 02 21:15:32 crc kubenswrapper[4807]: I1202 21:15:32.609373 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sh5mj" podUID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" containerName="registry-server" containerID="cri-o://341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac" gracePeriod=2 Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.227524 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.382086 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-catalog-content\") pod \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.382220 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcm28\" (UniqueName: \"kubernetes.io/projected/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-kube-api-access-tcm28\") pod \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.383231 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-utilities\") pod \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\" (UID: \"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4\") " Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.384514 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-utilities" (OuterVolumeSpecName: "utilities") pod "c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" (UID: 
"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.391968 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-kube-api-access-tcm28" (OuterVolumeSpecName: "kube-api-access-tcm28") pod "c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" (UID: "c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4"). InnerVolumeSpecName "kube-api-access-tcm28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.435783 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" (UID: "c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.487632 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.487754 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.487779 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcm28\" (UniqueName: \"kubernetes.io/projected/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4-kube-api-access-tcm28\") on node \"crc\" DevicePath \"\"" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.624587 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" containerID="341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac" exitCode=0 Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.624641 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sh5mj" event={"ID":"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4","Type":"ContainerDied","Data":"341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac"} Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.624678 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sh5mj" event={"ID":"c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4","Type":"ContainerDied","Data":"6354211be40a6b85078d2e8826843320673d23a144fed18bdc6ce758153e9e8d"} Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.624700 4807 scope.go:117] "RemoveContainer" containerID="341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.624751 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sh5mj" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.651942 4807 scope.go:117] "RemoveContainer" containerID="e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.675989 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sh5mj"] Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.685067 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sh5mj"] Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.688766 4807 scope.go:117] "RemoveContainer" containerID="eb9d30b04984d5cf168839fc5bc12b494f3726bfcc70459111c544adca1bb8c3" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.747508 4807 scope.go:117] "RemoveContainer" containerID="341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac" Dec 02 21:15:33 crc kubenswrapper[4807]: E1202 21:15:33.748334 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac\": container with ID starting with 341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac not found: ID does not exist" containerID="341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.748435 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac"} err="failed to get container status \"341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac\": rpc error: code = NotFound desc = could not find container \"341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac\": container with ID starting with 341e8ff6b38f88dc58cc0b1fe5d48a8478edbae7d0be3b35524b3006749fc9ac not 
found: ID does not exist" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.748514 4807 scope.go:117] "RemoveContainer" containerID="e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16" Dec 02 21:15:33 crc kubenswrapper[4807]: E1202 21:15:33.749035 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16\": container with ID starting with e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16 not found: ID does not exist" containerID="e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.749099 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16"} err="failed to get container status \"e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16\": rpc error: code = NotFound desc = could not find container \"e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16\": container with ID starting with e593261d7c04edef77ab2eea238e5772b1f38aa3f81e52914caa0d7b804ffd16 not found: ID does not exist" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.749139 4807 scope.go:117] "RemoveContainer" containerID="eb9d30b04984d5cf168839fc5bc12b494f3726bfcc70459111c544adca1bb8c3" Dec 02 21:15:33 crc kubenswrapper[4807]: E1202 21:15:33.749497 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb9d30b04984d5cf168839fc5bc12b494f3726bfcc70459111c544adca1bb8c3\": container with ID starting with eb9d30b04984d5cf168839fc5bc12b494f3726bfcc70459111c544adca1bb8c3 not found: ID does not exist" containerID="eb9d30b04984d5cf168839fc5bc12b494f3726bfcc70459111c544adca1bb8c3" Dec 02 21:15:33 crc kubenswrapper[4807]: I1202 21:15:33.749539 4807 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb9d30b04984d5cf168839fc5bc12b494f3726bfcc70459111c544adca1bb8c3"} err="failed to get container status \"eb9d30b04984d5cf168839fc5bc12b494f3726bfcc70459111c544adca1bb8c3\": rpc error: code = NotFound desc = could not find container \"eb9d30b04984d5cf168839fc5bc12b494f3726bfcc70459111c544adca1bb8c3\": container with ID starting with eb9d30b04984d5cf168839fc5bc12b494f3726bfcc70459111c544adca1bb8c3 not found: ID does not exist" Dec 02 21:15:34 crc kubenswrapper[4807]: I1202 21:15:34.988892 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" path="/var/lib/kubelet/pods/c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4/volumes" Dec 02 21:15:38 crc kubenswrapper[4807]: I1202 21:15:38.972706 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869" Dec 02 21:15:39 crc kubenswrapper[4807]: I1202 21:15:39.712390 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"d6f68288f7acfef5f740d5758d09b3f7845c56b5404e3aaf7c01570e324e2242"} Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.376188 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tqjp4"] Dec 02 21:17:51 crc kubenswrapper[4807]: E1202 21:17:51.377336 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" containerName="registry-server" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.377356 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" containerName="registry-server" Dec 02 21:17:51 crc kubenswrapper[4807]: E1202 21:17:51.377371 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" containerName="extract-content" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.377380 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" containerName="extract-content" Dec 02 21:17:51 crc kubenswrapper[4807]: E1202 21:17:51.377420 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" containerName="extract-utilities" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.377430 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" containerName="extract-utilities" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.377648 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1cb0e71-c9f7-47d7-bece-57b7f13bc6e4" containerName="registry-server" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.386255 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.395563 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tqjp4"] Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.451327 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxkfx\" (UniqueName: \"kubernetes.io/projected/b34b3921-496b-4a98-836e-9a3dc7690f0e-kube-api-access-jxkfx\") pod \"redhat-operators-tqjp4\" (UID: \"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.451401 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-utilities\") pod \"redhat-operators-tqjp4\" (UID: 
\"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.451535 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-catalog-content\") pod \"redhat-operators-tqjp4\" (UID: \"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.552567 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxkfx\" (UniqueName: \"kubernetes.io/projected/b34b3921-496b-4a98-836e-9a3dc7690f0e-kube-api-access-jxkfx\") pod \"redhat-operators-tqjp4\" (UID: \"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.552616 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-utilities\") pod \"redhat-operators-tqjp4\" (UID: \"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.552698 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-catalog-content\") pod \"redhat-operators-tqjp4\" (UID: \"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.553212 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-catalog-content\") pod \"redhat-operators-tqjp4\" (UID: 
\"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.553209 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-utilities\") pod \"redhat-operators-tqjp4\" (UID: \"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.569703 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxkfx\" (UniqueName: \"kubernetes.io/projected/b34b3921-496b-4a98-836e-9a3dc7690f0e-kube-api-access-jxkfx\") pod \"redhat-operators-tqjp4\" (UID: \"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:17:51 crc kubenswrapper[4807]: I1202 21:17:51.726666 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:17:52 crc kubenswrapper[4807]: I1202 21:17:52.352282 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tqjp4"] Dec 02 21:17:53 crc kubenswrapper[4807]: I1202 21:17:53.309483 4807 generic.go:334] "Generic (PLEG): container finished" podID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerID="19c30f25601c687cac3d274b2c50f8200714f878d2d0aaafa241251710b97090" exitCode=0 Dec 02 21:17:53 crc kubenswrapper[4807]: I1202 21:17:53.309548 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqjp4" event={"ID":"b34b3921-496b-4a98-836e-9a3dc7690f0e","Type":"ContainerDied","Data":"19c30f25601c687cac3d274b2c50f8200714f878d2d0aaafa241251710b97090"} Dec 02 21:17:53 crc kubenswrapper[4807]: I1202 21:17:53.309792 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqjp4" 
event={"ID":"b34b3921-496b-4a98-836e-9a3dc7690f0e","Type":"ContainerStarted","Data":"be8904d8b13495d0611a88423bcdb7acf2c401915f1650c6f900cd66f5f9fb21"} Dec 02 21:17:55 crc kubenswrapper[4807]: I1202 21:17:55.334367 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqjp4" event={"ID":"b34b3921-496b-4a98-836e-9a3dc7690f0e","Type":"ContainerStarted","Data":"6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8"} Dec 02 21:17:57 crc kubenswrapper[4807]: I1202 21:17:57.361294 4807 generic.go:334] "Generic (PLEG): container finished" podID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerID="6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8" exitCode=0 Dec 02 21:17:57 crc kubenswrapper[4807]: I1202 21:17:57.361632 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqjp4" event={"ID":"b34b3921-496b-4a98-836e-9a3dc7690f0e","Type":"ContainerDied","Data":"6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8"} Dec 02 21:17:58 crc kubenswrapper[4807]: I1202 21:17:58.292848 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:17:58 crc kubenswrapper[4807]: I1202 21:17:58.293344 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 21:17:58 crc kubenswrapper[4807]: I1202 21:17:58.372869 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqjp4" 
event={"ID":"b34b3921-496b-4a98-836e-9a3dc7690f0e","Type":"ContainerStarted","Data":"768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef"} Dec 02 21:17:58 crc kubenswrapper[4807]: I1202 21:17:58.391844 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tqjp4" podStartSLOduration=2.696108844 podStartE2EDuration="7.39181779s" podCreationTimestamp="2025-12-02 21:17:51 +0000 UTC" firstStartedPulling="2025-12-02 21:17:53.313275958 +0000 UTC m=+4808.614183453" lastFinishedPulling="2025-12-02 21:17:58.008984874 +0000 UTC m=+4813.309892399" observedRunningTime="2025-12-02 21:17:58.390218605 +0000 UTC m=+4813.691126120" watchObservedRunningTime="2025-12-02 21:17:58.39181779 +0000 UTC m=+4813.692725425" Dec 02 21:18:01 crc kubenswrapper[4807]: I1202 21:18:01.726930 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:18:01 crc kubenswrapper[4807]: I1202 21:18:01.727747 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:18:02 crc kubenswrapper[4807]: I1202 21:18:02.787389 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tqjp4" podUID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerName="registry-server" probeResult="failure" output=< Dec 02 21:18:02 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 21:18:02 crc kubenswrapper[4807]: > Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.587692 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dshp7"] Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.591065 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dshp7" Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.605502 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshp7"] Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.635509 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-catalog-content\") pod \"redhat-marketplace-dshp7\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") " pod="openshift-marketplace/redhat-marketplace-dshp7" Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.636023 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-utilities\") pod \"redhat-marketplace-dshp7\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") " pod="openshift-marketplace/redhat-marketplace-dshp7" Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.636238 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhnb2\" (UniqueName: \"kubernetes.io/projected/dd67e634-b4f0-45fe-be22-316715241827-kube-api-access-vhnb2\") pod \"redhat-marketplace-dshp7\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") " pod="openshift-marketplace/redhat-marketplace-dshp7" Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.737621 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-utilities\") pod \"redhat-marketplace-dshp7\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") " pod="openshift-marketplace/redhat-marketplace-dshp7" Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.737704 4807 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-vhnb2\" (UniqueName: \"kubernetes.io/projected/dd67e634-b4f0-45fe-be22-316715241827-kube-api-access-vhnb2\") pod \"redhat-marketplace-dshp7\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") " pod="openshift-marketplace/redhat-marketplace-dshp7" Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.738250 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-utilities\") pod \"redhat-marketplace-dshp7\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") " pod="openshift-marketplace/redhat-marketplace-dshp7" Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.739697 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-catalog-content\") pod \"redhat-marketplace-dshp7\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") " pod="openshift-marketplace/redhat-marketplace-dshp7" Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.740100 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-catalog-content\") pod \"redhat-marketplace-dshp7\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") " pod="openshift-marketplace/redhat-marketplace-dshp7" Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.778513 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhnb2\" (UniqueName: \"kubernetes.io/projected/dd67e634-b4f0-45fe-be22-316715241827-kube-api-access-vhnb2\") pod \"redhat-marketplace-dshp7\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") " pod="openshift-marketplace/redhat-marketplace-dshp7" Dec 02 21:18:06 crc kubenswrapper[4807]: I1202 21:18:06.915253 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dshp7" Dec 02 21:18:07 crc kubenswrapper[4807]: I1202 21:18:07.788422 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="df8d83f3-6675-416b-a039-2aafac45fe18" containerName="galera" probeResult="failure" output="command timed out" Dec 02 21:18:08 crc kubenswrapper[4807]: I1202 21:18:08.056689 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshp7"] Dec 02 21:18:08 crc kubenswrapper[4807]: I1202 21:18:08.475768 4807 generic.go:334] "Generic (PLEG): container finished" podID="dd67e634-b4f0-45fe-be22-316715241827" containerID="bb16228776f4fdd88073881b2c079109ad28021273eda9bbac14f487d1ff426b" exitCode=0 Dec 02 21:18:08 crc kubenswrapper[4807]: I1202 21:18:08.475908 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshp7" event={"ID":"dd67e634-b4f0-45fe-be22-316715241827","Type":"ContainerDied","Data":"bb16228776f4fdd88073881b2c079109ad28021273eda9bbac14f487d1ff426b"} Dec 02 21:18:08 crc kubenswrapper[4807]: I1202 21:18:08.476401 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshp7" event={"ID":"dd67e634-b4f0-45fe-be22-316715241827","Type":"ContainerStarted","Data":"efafd40941c8cb334e27cabc3eee319c266cc94a1bc9021a3af368715696ff42"} Dec 02 21:18:10 crc kubenswrapper[4807]: I1202 21:18:10.499441 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshp7" event={"ID":"dd67e634-b4f0-45fe-be22-316715241827","Type":"ContainerStarted","Data":"2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559"} Dec 02 21:18:11 crc kubenswrapper[4807]: I1202 21:18:11.513786 4807 generic.go:334] "Generic (PLEG): container finished" podID="dd67e634-b4f0-45fe-be22-316715241827" containerID="2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559" 
exitCode=0 Dec 02 21:18:11 crc kubenswrapper[4807]: I1202 21:18:11.513867 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshp7" event={"ID":"dd67e634-b4f0-45fe-be22-316715241827","Type":"ContainerDied","Data":"2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559"} Dec 02 21:18:11 crc kubenswrapper[4807]: I1202 21:18:11.819875 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:18:11 crc kubenswrapper[4807]: I1202 21:18:11.901550 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:18:12 crc kubenswrapper[4807]: I1202 21:18:12.526164 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshp7" event={"ID":"dd67e634-b4f0-45fe-be22-316715241827","Type":"ContainerStarted","Data":"45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58"} Dec 02 21:18:12 crc kubenswrapper[4807]: I1202 21:18:12.556244 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dshp7" podStartSLOduration=3.060052498 podStartE2EDuration="6.556223749s" podCreationTimestamp="2025-12-02 21:18:06 +0000 UTC" firstStartedPulling="2025-12-02 21:18:08.479077186 +0000 UTC m=+4823.779984681" lastFinishedPulling="2025-12-02 21:18:11.975248427 +0000 UTC m=+4827.276155932" observedRunningTime="2025-12-02 21:18:12.551815895 +0000 UTC m=+4827.852723390" watchObservedRunningTime="2025-12-02 21:18:12.556223749 +0000 UTC m=+4827.857131244" Dec 02 21:18:13 crc kubenswrapper[4807]: I1202 21:18:13.153644 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tqjp4"] Dec 02 21:18:13 crc kubenswrapper[4807]: I1202 21:18:13.536139 4807 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-tqjp4" podUID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerName="registry-server" containerID="cri-o://768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef" gracePeriod=2 Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.091983 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tqjp4" Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.274960 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-utilities\") pod \"b34b3921-496b-4a98-836e-9a3dc7690f0e\" (UID: \"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.275480 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-catalog-content\") pod \"b34b3921-496b-4a98-836e-9a3dc7690f0e\" (UID: \"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.275585 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxkfx\" (UniqueName: \"kubernetes.io/projected/b34b3921-496b-4a98-836e-9a3dc7690f0e-kube-api-access-jxkfx\") pod \"b34b3921-496b-4a98-836e-9a3dc7690f0e\" (UID: \"b34b3921-496b-4a98-836e-9a3dc7690f0e\") " Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.275770 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-utilities" (OuterVolumeSpecName: "utilities") pod "b34b3921-496b-4a98-836e-9a3dc7690f0e" (UID: "b34b3921-496b-4a98-836e-9a3dc7690f0e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.276176 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.285936 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34b3921-496b-4a98-836e-9a3dc7690f0e-kube-api-access-jxkfx" (OuterVolumeSpecName: "kube-api-access-jxkfx") pod "b34b3921-496b-4a98-836e-9a3dc7690f0e" (UID: "b34b3921-496b-4a98-836e-9a3dc7690f0e"). InnerVolumeSpecName "kube-api-access-jxkfx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.378171 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxkfx\" (UniqueName: \"kubernetes.io/projected/b34b3921-496b-4a98-836e-9a3dc7690f0e-kube-api-access-jxkfx\") on node \"crc\" DevicePath \"\""
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.431313 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b34b3921-496b-4a98-836e-9a3dc7690f0e" (UID: "b34b3921-496b-4a98-836e-9a3dc7690f0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.480870 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b34b3921-496b-4a98-836e-9a3dc7690f0e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.555616 4807 generic.go:334] "Generic (PLEG): container finished" podID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerID="768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef" exitCode=0
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.555677 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqjp4" event={"ID":"b34b3921-496b-4a98-836e-9a3dc7690f0e","Type":"ContainerDied","Data":"768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef"}
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.555741 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tqjp4" event={"ID":"b34b3921-496b-4a98-836e-9a3dc7690f0e","Type":"ContainerDied","Data":"be8904d8b13495d0611a88423bcdb7acf2c401915f1650c6f900cd66f5f9fb21"}
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.555747 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tqjp4"
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.555773 4807 scope.go:117] "RemoveContainer" containerID="768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef"
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.605899 4807 scope.go:117] "RemoveContainer" containerID="6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8"
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.656777 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tqjp4"]
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.666917 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tqjp4"]
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.676520 4807 scope.go:117] "RemoveContainer" containerID="19c30f25601c687cac3d274b2c50f8200714f878d2d0aaafa241251710b97090"
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.724907 4807 scope.go:117] "RemoveContainer" containerID="768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef"
Dec 02 21:18:14 crc kubenswrapper[4807]: E1202 21:18:14.728881 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef\": container with ID starting with 768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef not found: ID does not exist" containerID="768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef"
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.728929 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef"} err="failed to get container status \"768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef\": rpc error: code = NotFound desc = could not find container \"768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef\": container with ID starting with 768236bdc21ef1199158ab919fc7c6498abc8c60c831bc567aa0e0188c1fdeef not found: ID does not exist"
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.728959 4807 scope.go:117] "RemoveContainer" containerID="6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8"
Dec 02 21:18:14 crc kubenswrapper[4807]: E1202 21:18:14.730819 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8\": container with ID starting with 6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8 not found: ID does not exist" containerID="6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8"
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.730860 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8"} err="failed to get container status \"6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8\": rpc error: code = NotFound desc = could not find container \"6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8\": container with ID starting with 6d6162738b418fd308f1b778d1b63bef16be53fc83f069ca3b1d6233c96112c8 not found: ID does not exist"
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.730877 4807 scope.go:117] "RemoveContainer" containerID="19c30f25601c687cac3d274b2c50f8200714f878d2d0aaafa241251710b97090"
Dec 02 21:18:14 crc kubenswrapper[4807]: E1202 21:18:14.733462 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19c30f25601c687cac3d274b2c50f8200714f878d2d0aaafa241251710b97090\": container with ID starting with 19c30f25601c687cac3d274b2c50f8200714f878d2d0aaafa241251710b97090 not found: ID does not exist" containerID="19c30f25601c687cac3d274b2c50f8200714f878d2d0aaafa241251710b97090"
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.733493 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c30f25601c687cac3d274b2c50f8200714f878d2d0aaafa241251710b97090"} err="failed to get container status \"19c30f25601c687cac3d274b2c50f8200714f878d2d0aaafa241251710b97090\": rpc error: code = NotFound desc = could not find container \"19c30f25601c687cac3d274b2c50f8200714f878d2d0aaafa241251710b97090\": container with ID starting with 19c30f25601c687cac3d274b2c50f8200714f878d2d0aaafa241251710b97090 not found: ID does not exist"
Dec 02 21:18:14 crc kubenswrapper[4807]: I1202 21:18:14.986383 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b34b3921-496b-4a98-836e-9a3dc7690f0e" path="/var/lib/kubelet/pods/b34b3921-496b-4a98-836e-9a3dc7690f0e/volumes"
Dec 02 21:18:16 crc kubenswrapper[4807]: I1202 21:18:16.915583 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dshp7"
Dec 02 21:18:16 crc kubenswrapper[4807]: I1202 21:18:16.916057 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dshp7"
Dec 02 21:18:16 crc kubenswrapper[4807]: I1202 21:18:16.993357 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dshp7"
Dec 02 21:18:17 crc kubenswrapper[4807]: I1202 21:18:17.681069 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dshp7"
Dec 02 21:18:18 crc kubenswrapper[4807]: I1202 21:18:18.155785 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshp7"]
Dec 02 21:18:19 crc kubenswrapper[4807]: I1202 21:18:19.614713 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dshp7" podUID="dd67e634-b4f0-45fe-be22-316715241827" containerName="registry-server" containerID="cri-o://45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58" gracePeriod=2
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.148247 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dshp7"
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.318276 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhnb2\" (UniqueName: \"kubernetes.io/projected/dd67e634-b4f0-45fe-be22-316715241827-kube-api-access-vhnb2\") pod \"dd67e634-b4f0-45fe-be22-316715241827\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") "
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.318368 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-utilities\") pod \"dd67e634-b4f0-45fe-be22-316715241827\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") "
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.318449 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-catalog-content\") pod \"dd67e634-b4f0-45fe-be22-316715241827\" (UID: \"dd67e634-b4f0-45fe-be22-316715241827\") "
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.319617 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-utilities" (OuterVolumeSpecName: "utilities") pod "dd67e634-b4f0-45fe-be22-316715241827" (UID: "dd67e634-b4f0-45fe-be22-316715241827"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.337890 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd67e634-b4f0-45fe-be22-316715241827" (UID: "dd67e634-b4f0-45fe-be22-316715241827"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.421549 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.421589 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd67e634-b4f0-45fe-be22-316715241827-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.625955 4807 generic.go:334] "Generic (PLEG): container finished" podID="dd67e634-b4f0-45fe-be22-316715241827" containerID="45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58" exitCode=0
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.626008 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshp7" event={"ID":"dd67e634-b4f0-45fe-be22-316715241827","Type":"ContainerDied","Data":"45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58"}
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.626044 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dshp7"
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.626071 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dshp7" event={"ID":"dd67e634-b4f0-45fe-be22-316715241827","Type":"ContainerDied","Data":"efafd40941c8cb334e27cabc3eee319c266cc94a1bc9021a3af368715696ff42"}
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.626097 4807 scope.go:117] "RemoveContainer" containerID="45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58"
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.651531 4807 scope.go:117] "RemoveContainer" containerID="2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559"
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.689629 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd67e634-b4f0-45fe-be22-316715241827-kube-api-access-vhnb2" (OuterVolumeSpecName: "kube-api-access-vhnb2") pod "dd67e634-b4f0-45fe-be22-316715241827" (UID: "dd67e634-b4f0-45fe-be22-316715241827"). InnerVolumeSpecName "kube-api-access-vhnb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.705688 4807 scope.go:117] "RemoveContainer" containerID="bb16228776f4fdd88073881b2c079109ad28021273eda9bbac14f487d1ff426b"
Dec 02 21:18:20 crc kubenswrapper[4807]: I1202 21:18:20.727672 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhnb2\" (UniqueName: \"kubernetes.io/projected/dd67e634-b4f0-45fe-be22-316715241827-kube-api-access-vhnb2\") on node \"crc\" DevicePath \"\""
Dec 02 21:18:21 crc kubenswrapper[4807]: I1202 21:18:21.012897 4807 scope.go:117] "RemoveContainer" containerID="45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58"
Dec 02 21:18:21 crc kubenswrapper[4807]: E1202 21:18:21.013667 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58\": container with ID starting with 45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58 not found: ID does not exist" containerID="45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58"
Dec 02 21:18:21 crc kubenswrapper[4807]: I1202 21:18:21.013729 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58"} err="failed to get container status \"45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58\": rpc error: code = NotFound desc = could not find container \"45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58\": container with ID starting with 45be1bf1876923646fa8f81ba196b157847ee7202c52f55b0c4973e45ea22b58 not found: ID does not exist"
Dec 02 21:18:21 crc kubenswrapper[4807]: I1202 21:18:21.013759 4807 scope.go:117] "RemoveContainer" containerID="2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559"
Dec 02 21:18:21 crc kubenswrapper[4807]: E1202 21:18:21.014292 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559\": container with ID starting with 2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559 not found: ID does not exist" containerID="2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559"
Dec 02 21:18:21 crc kubenswrapper[4807]: I1202 21:18:21.014356 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559"} err="failed to get container status \"2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559\": rpc error: code = NotFound desc = could not find container \"2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559\": container with ID starting with 2b354446068e91959d6cfcddfb7570694b362116e3be17306607e61e0e598559 not found: ID does not exist"
Dec 02 21:18:21 crc kubenswrapper[4807]: I1202 21:18:21.014377 4807 scope.go:117] "RemoveContainer" containerID="bb16228776f4fdd88073881b2c079109ad28021273eda9bbac14f487d1ff426b"
Dec 02 21:18:21 crc kubenswrapper[4807]: E1202 21:18:21.014762 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb16228776f4fdd88073881b2c079109ad28021273eda9bbac14f487d1ff426b\": container with ID starting with bb16228776f4fdd88073881b2c079109ad28021273eda9bbac14f487d1ff426b not found: ID does not exist" containerID="bb16228776f4fdd88073881b2c079109ad28021273eda9bbac14f487d1ff426b"
Dec 02 21:18:21 crc kubenswrapper[4807]: I1202 21:18:21.014896 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb16228776f4fdd88073881b2c079109ad28021273eda9bbac14f487d1ff426b"} err="failed to get container status \"bb16228776f4fdd88073881b2c079109ad28021273eda9bbac14f487d1ff426b\": rpc error: code = NotFound desc = could not find container \"bb16228776f4fdd88073881b2c079109ad28021273eda9bbac14f487d1ff426b\": container with ID starting with bb16228776f4fdd88073881b2c079109ad28021273eda9bbac14f487d1ff426b not found: ID does not exist"
Dec 02 21:18:21 crc kubenswrapper[4807]: I1202 21:18:21.076791 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshp7"]
Dec 02 21:18:21 crc kubenswrapper[4807]: I1202 21:18:21.085630 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dshp7"]
Dec 02 21:18:22 crc kubenswrapper[4807]: I1202 21:18:22.988856 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd67e634-b4f0-45fe-be22-316715241827" path="/var/lib/kubelet/pods/dd67e634-b4f0-45fe-be22-316715241827/volumes"
Dec 02 21:18:28 crc kubenswrapper[4807]: I1202 21:18:28.292457 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 21:18:28 crc kubenswrapper[4807]: I1202 21:18:28.292884 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 21:18:58 crc kubenswrapper[4807]: I1202 21:18:58.293564 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 21:18:58 crc kubenswrapper[4807]: I1202 21:18:58.294393 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 21:18:58 crc kubenswrapper[4807]: I1202 21:18:58.294484 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5"
Dec 02 21:18:58 crc kubenswrapper[4807]: I1202 21:18:58.295573 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d6f68288f7acfef5f740d5758d09b3f7845c56b5404e3aaf7c01570e324e2242"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 21:18:58 crc kubenswrapper[4807]: I1202 21:18:58.295675 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://d6f68288f7acfef5f740d5758d09b3f7845c56b5404e3aaf7c01570e324e2242" gracePeriod=600
Dec 02 21:18:59 crc kubenswrapper[4807]: I1202 21:18:59.094132 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="d6f68288f7acfef5f740d5758d09b3f7845c56b5404e3aaf7c01570e324e2242" exitCode=0
Dec 02 21:18:59 crc kubenswrapper[4807]: I1202 21:18:59.094216 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"d6f68288f7acfef5f740d5758d09b3f7845c56b5404e3aaf7c01570e324e2242"}
Dec 02 21:18:59 crc kubenswrapper[4807]: I1202 21:18:59.094782 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"}
Dec 02 21:18:59 crc kubenswrapper[4807]: I1202 21:18:59.094854 4807 scope.go:117] "RemoveContainer" containerID="c256bdf9cd57a390a7654956be0a8b55b588b13b22bc3aa66cae6c8663d98869"
Dec 02 21:20:58 crc kubenswrapper[4807]: I1202 21:20:58.292915 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 21:20:58 crc kubenswrapper[4807]: I1202 21:20:58.293477 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 21:21:28 crc kubenswrapper[4807]: I1202 21:21:28.293078 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 21:21:28 crc kubenswrapper[4807]: I1202 21:21:28.293825 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 21:21:58 crc kubenswrapper[4807]: I1202 21:21:58.293198 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 21:21:58 crc kubenswrapper[4807]: I1202 21:21:58.293939 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 21:21:58 crc kubenswrapper[4807]: I1202 21:21:58.294011 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5"
Dec 02 21:21:58 crc kubenswrapper[4807]: I1202 21:21:58.295188 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 21:21:58 crc kubenswrapper[4807]: I1202 21:21:58.295294 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" gracePeriod=600
Dec 02 21:21:58 crc kubenswrapper[4807]: E1202 21:21:58.653615 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:21:59 crc kubenswrapper[4807]: I1202 21:21:59.200937 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" exitCode=0
Dec 02 21:21:59 crc kubenswrapper[4807]: I1202 21:21:59.201004 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"}
Dec 02 21:21:59 crc kubenswrapper[4807]: I1202 21:21:59.201056 4807 scope.go:117] "RemoveContainer" containerID="d6f68288f7acfef5f740d5758d09b3f7845c56b5404e3aaf7c01570e324e2242"
Dec 02 21:21:59 crc kubenswrapper[4807]: I1202 21:21:59.201963 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"
Dec 02 21:21:59 crc kubenswrapper[4807]: E1202 21:21:59.202497 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:22:12 crc kubenswrapper[4807]: I1202 21:22:12.973022 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"
Dec 02 21:22:12 crc kubenswrapper[4807]: E1202 21:22:12.973787 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:22:27 crc kubenswrapper[4807]: I1202 21:22:27.972664 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"
Dec 02 21:22:27 crc kubenswrapper[4807]: E1202 21:22:27.973982 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:22:38 crc kubenswrapper[4807]: I1202 21:22:38.973463 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"
Dec 02 21:22:38 crc kubenswrapper[4807]: E1202 21:22:38.974176 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:22:51 crc kubenswrapper[4807]: I1202 21:22:51.972143 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"
Dec 02 21:22:51 crc kubenswrapper[4807]: E1202 21:22:51.972912 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.530175 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8cwsn"]
Dec 02 21:22:55 crc kubenswrapper[4807]: E1202 21:22:55.533156 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd67e634-b4f0-45fe-be22-316715241827" containerName="registry-server"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.533340 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd67e634-b4f0-45fe-be22-316715241827" containerName="registry-server"
Dec 02 21:22:55 crc kubenswrapper[4807]: E1202 21:22:55.533526 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerName="extract-content"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.533655 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerName="extract-content"
Dec 02 21:22:55 crc kubenswrapper[4807]: E1202 21:22:55.533829 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd67e634-b4f0-45fe-be22-316715241827" containerName="extract-utilities"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.533954 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd67e634-b4f0-45fe-be22-316715241827" containerName="extract-utilities"
Dec 02 21:22:55 crc kubenswrapper[4807]: E1202 21:22:55.534087 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerName="registry-server"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.534205 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerName="registry-server"
Dec 02 21:22:55 crc kubenswrapper[4807]: E1202 21:22:55.534346 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerName="extract-utilities"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.534467 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerName="extract-utilities"
Dec 02 21:22:55 crc kubenswrapper[4807]: E1202 21:22:55.534618 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd67e634-b4f0-45fe-be22-316715241827" containerName="extract-content"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.534764 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd67e634-b4f0-45fe-be22-316715241827" containerName="extract-content"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.535278 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34b3921-496b-4a98-836e-9a3dc7690f0e" containerName="registry-server"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.535427 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd67e634-b4f0-45fe-be22-316715241827" containerName="registry-server"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.540805 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cwsn"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.549110 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8cwsn"]
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.696755 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t48dw\" (UniqueName: \"kubernetes.io/projected/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-kube-api-access-t48dw\") pod \"community-operators-8cwsn\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " pod="openshift-marketplace/community-operators-8cwsn"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.697192 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-utilities\") pod \"community-operators-8cwsn\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " pod="openshift-marketplace/community-operators-8cwsn"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.697548 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-catalog-content\") pod \"community-operators-8cwsn\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " pod="openshift-marketplace/community-operators-8cwsn"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.799230 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t48dw\" (UniqueName: \"kubernetes.io/projected/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-kube-api-access-t48dw\") pod \"community-operators-8cwsn\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " pod="openshift-marketplace/community-operators-8cwsn"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.799527 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-utilities\") pod \"community-operators-8cwsn\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " pod="openshift-marketplace/community-operators-8cwsn"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.799617 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-catalog-content\") pod \"community-operators-8cwsn\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " pod="openshift-marketplace/community-operators-8cwsn"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.800155 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-utilities\") pod \"community-operators-8cwsn\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " pod="openshift-marketplace/community-operators-8cwsn"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.800184 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-catalog-content\") pod \"community-operators-8cwsn\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " pod="openshift-marketplace/community-operators-8cwsn"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.840627 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t48dw\" (UniqueName: \"kubernetes.io/projected/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-kube-api-access-t48dw\") pod \"community-operators-8cwsn\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " pod="openshift-marketplace/community-operators-8cwsn"
Dec 02 21:22:55 crc kubenswrapper[4807]: I1202 21:22:55.893343 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cwsn"
Dec 02 21:22:56 crc kubenswrapper[4807]: I1202 21:22:56.467020 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8cwsn"]
Dec 02 21:22:56 crc kubenswrapper[4807]: W1202 21:22:56.476624 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54c7dfe2_a59d_49fe_8024_9eb83960cbc3.slice/crio-097039e0406f34ed4a707cb184d9e48694a140008aeb7b10bfbc0d57df0531e1 WatchSource:0}: Error finding container 097039e0406f34ed4a707cb184d9e48694a140008aeb7b10bfbc0d57df0531e1: Status 404 returned error can't find the container with id 097039e0406f34ed4a707cb184d9e48694a140008aeb7b10bfbc0d57df0531e1
Dec 02 21:22:56 crc kubenswrapper[4807]: I1202 21:22:56.884695 4807 generic.go:334] "Generic (PLEG): container finished" podID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" containerID="240760e450656a13f8f539a95038999337c86d95115bdb16c67a84b6381991b0" exitCode=0
Dec 02 21:22:56 crc kubenswrapper[4807]: I1202 21:22:56.884759 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cwsn" event={"ID":"54c7dfe2-a59d-49fe-8024-9eb83960cbc3","Type":"ContainerDied","Data":"240760e450656a13f8f539a95038999337c86d95115bdb16c67a84b6381991b0"}
Dec 02 21:22:56 crc kubenswrapper[4807]: I1202 21:22:56.885248 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cwsn" event={"ID":"54c7dfe2-a59d-49fe-8024-9eb83960cbc3","Type":"ContainerStarted","Data":"097039e0406f34ed4a707cb184d9e48694a140008aeb7b10bfbc0d57df0531e1"}
Dec 02 21:22:56 crc kubenswrapper[4807]: I1202 21:22:56.886674 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 21:22:57 crc kubenswrapper[4807]: I1202 21:22:57.897141 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-8cwsn" event={"ID":"54c7dfe2-a59d-49fe-8024-9eb83960cbc3","Type":"ContainerStarted","Data":"3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723"} Dec 02 21:22:58 crc kubenswrapper[4807]: I1202 21:22:58.914578 4807 generic.go:334] "Generic (PLEG): container finished" podID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" containerID="3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723" exitCode=0 Dec 02 21:22:58 crc kubenswrapper[4807]: I1202 21:22:58.915093 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cwsn" event={"ID":"54c7dfe2-a59d-49fe-8024-9eb83960cbc3","Type":"ContainerDied","Data":"3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723"} Dec 02 21:22:59 crc kubenswrapper[4807]: I1202 21:22:59.934483 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cwsn" event={"ID":"54c7dfe2-a59d-49fe-8024-9eb83960cbc3","Type":"ContainerStarted","Data":"f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95"} Dec 02 21:22:59 crc kubenswrapper[4807]: I1202 21:22:59.960127 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8cwsn" podStartSLOduration=2.420184317 podStartE2EDuration="4.960090941s" podCreationTimestamp="2025-12-02 21:22:55 +0000 UTC" firstStartedPulling="2025-12-02 21:22:56.886354323 +0000 UTC m=+5112.187261838" lastFinishedPulling="2025-12-02 21:22:59.426260967 +0000 UTC m=+5114.727168462" observedRunningTime="2025-12-02 21:22:59.953501255 +0000 UTC m=+5115.254408760" watchObservedRunningTime="2025-12-02 21:22:59.960090941 +0000 UTC m=+5115.260998456" Dec 02 21:23:05 crc kubenswrapper[4807]: I1202 21:23:05.894258 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8cwsn" Dec 02 21:23:05 crc kubenswrapper[4807]: I1202 21:23:05.895116 
4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8cwsn" Dec 02 21:23:05 crc kubenswrapper[4807]: I1202 21:23:05.973509 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:23:05 crc kubenswrapper[4807]: E1202 21:23:05.973730 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:23:05 crc kubenswrapper[4807]: I1202 21:23:05.989019 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8cwsn" Dec 02 21:23:06 crc kubenswrapper[4807]: I1202 21:23:06.074320 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8cwsn" Dec 02 21:23:06 crc kubenswrapper[4807]: I1202 21:23:06.235279 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8cwsn"] Dec 02 21:23:08 crc kubenswrapper[4807]: I1202 21:23:08.019084 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8cwsn" podUID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" containerName="registry-server" containerID="cri-o://f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95" gracePeriod=2 Dec 02 21:23:08 crc kubenswrapper[4807]: I1202 21:23:08.567981 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8cwsn" Dec 02 21:23:08 crc kubenswrapper[4807]: I1202 21:23:08.688306 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-catalog-content\") pod \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " Dec 02 21:23:08 crc kubenswrapper[4807]: I1202 21:23:08.688517 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t48dw\" (UniqueName: \"kubernetes.io/projected/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-kube-api-access-t48dw\") pod \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " Dec 02 21:23:08 crc kubenswrapper[4807]: I1202 21:23:08.688766 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-utilities\") pod \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\" (UID: \"54c7dfe2-a59d-49fe-8024-9eb83960cbc3\") " Dec 02 21:23:08 crc kubenswrapper[4807]: I1202 21:23:08.690400 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-utilities" (OuterVolumeSpecName: "utilities") pod "54c7dfe2-a59d-49fe-8024-9eb83960cbc3" (UID: "54c7dfe2-a59d-49fe-8024-9eb83960cbc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:23:08 crc kubenswrapper[4807]: I1202 21:23:08.694911 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-kube-api-access-t48dw" (OuterVolumeSpecName: "kube-api-access-t48dw") pod "54c7dfe2-a59d-49fe-8024-9eb83960cbc3" (UID: "54c7dfe2-a59d-49fe-8024-9eb83960cbc3"). InnerVolumeSpecName "kube-api-access-t48dw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:23:08 crc kubenswrapper[4807]: I1202 21:23:08.741291 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54c7dfe2-a59d-49fe-8024-9eb83960cbc3" (UID: "54c7dfe2-a59d-49fe-8024-9eb83960cbc3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:23:08 crc kubenswrapper[4807]: I1202 21:23:08.792221 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t48dw\" (UniqueName: \"kubernetes.io/projected/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-kube-api-access-t48dw\") on node \"crc\" DevicePath \"\"" Dec 02 21:23:08 crc kubenswrapper[4807]: I1202 21:23:08.792453 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 21:23:08 crc kubenswrapper[4807]: I1202 21:23:08.792481 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54c7dfe2-a59d-49fe-8024-9eb83960cbc3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.036102 4807 generic.go:334] "Generic (PLEG): container finished" podID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" containerID="f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95" exitCode=0 Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.036153 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cwsn" event={"ID":"54c7dfe2-a59d-49fe-8024-9eb83960cbc3","Type":"ContainerDied","Data":"f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95"} Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.037112 4807 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-8cwsn" event={"ID":"54c7dfe2-a59d-49fe-8024-9eb83960cbc3","Type":"ContainerDied","Data":"097039e0406f34ed4a707cb184d9e48694a140008aeb7b10bfbc0d57df0531e1"} Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.037138 4807 scope.go:117] "RemoveContainer" containerID="f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95" Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.036235 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cwsn" Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.064568 4807 scope.go:117] "RemoveContainer" containerID="3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723" Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.076677 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8cwsn"] Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.098767 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8cwsn"] Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.100013 4807 scope.go:117] "RemoveContainer" containerID="240760e450656a13f8f539a95038999337c86d95115bdb16c67a84b6381991b0" Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.141337 4807 scope.go:117] "RemoveContainer" containerID="f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95" Dec 02 21:23:09 crc kubenswrapper[4807]: E1202 21:23:09.142483 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95\": container with ID starting with f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95 not found: ID does not exist" containerID="f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95" Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 
21:23:09.142563 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95"} err="failed to get container status \"f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95\": rpc error: code = NotFound desc = could not find container \"f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95\": container with ID starting with f7dc99ed4a5c926aa9ed8696ad1744d9b4b5dddc4f8ea6877ebe81754b16ba95 not found: ID does not exist" Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.142625 4807 scope.go:117] "RemoveContainer" containerID="3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723" Dec 02 21:23:09 crc kubenswrapper[4807]: E1202 21:23:09.143445 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723\": container with ID starting with 3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723 not found: ID does not exist" containerID="3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723" Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.143502 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723"} err="failed to get container status \"3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723\": rpc error: code = NotFound desc = could not find container \"3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723\": container with ID starting with 3c7fcf9dc2689f901aef0cfe6b863ad0cddcca052bd27f44dc52fb9538f02723 not found: ID does not exist" Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.143522 4807 scope.go:117] "RemoveContainer" containerID="240760e450656a13f8f539a95038999337c86d95115bdb16c67a84b6381991b0" Dec 02 21:23:09 crc 
kubenswrapper[4807]: E1202 21:23:09.144220 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"240760e450656a13f8f539a95038999337c86d95115bdb16c67a84b6381991b0\": container with ID starting with 240760e450656a13f8f539a95038999337c86d95115bdb16c67a84b6381991b0 not found: ID does not exist" containerID="240760e450656a13f8f539a95038999337c86d95115bdb16c67a84b6381991b0" Dec 02 21:23:09 crc kubenswrapper[4807]: I1202 21:23:09.144274 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240760e450656a13f8f539a95038999337c86d95115bdb16c67a84b6381991b0"} err="failed to get container status \"240760e450656a13f8f539a95038999337c86d95115bdb16c67a84b6381991b0\": rpc error: code = NotFound desc = could not find container \"240760e450656a13f8f539a95038999337c86d95115bdb16c67a84b6381991b0\": container with ID starting with 240760e450656a13f8f539a95038999337c86d95115bdb16c67a84b6381991b0 not found: ID does not exist" Dec 02 21:23:10 crc kubenswrapper[4807]: I1202 21:23:10.994382 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" path="/var/lib/kubelet/pods/54c7dfe2-a59d-49fe-8024-9eb83960cbc3/volumes" Dec 02 21:23:18 crc kubenswrapper[4807]: I1202 21:23:18.972419 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:23:18 crc kubenswrapper[4807]: E1202 21:23:18.973677 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:23:31 crc 
kubenswrapper[4807]: I1202 21:23:31.293744 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:23:31 crc kubenswrapper[4807]: E1202 21:23:31.297493 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:23:41 crc kubenswrapper[4807]: I1202 21:23:41.973455 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:23:41 crc kubenswrapper[4807]: E1202 21:23:41.974772 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:23:52 crc kubenswrapper[4807]: I1202 21:23:52.974092 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:23:52 crc kubenswrapper[4807]: E1202 21:23:52.976224 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 
02 21:24:06 crc kubenswrapper[4807]: I1202 21:24:06.973548 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:24:06 crc kubenswrapper[4807]: E1202 21:24:06.974793 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:24:20 crc kubenswrapper[4807]: I1202 21:24:20.972541 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:24:20 crc kubenswrapper[4807]: E1202 21:24:20.973430 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:24:35 crc kubenswrapper[4807]: I1202 21:24:35.972516 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:24:35 crc kubenswrapper[4807]: E1202 21:24:35.973783 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" 
podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:24:48 crc kubenswrapper[4807]: I1202 21:24:48.973058 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:24:48 crc kubenswrapper[4807]: E1202 21:24:48.974254 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:24:56 crc kubenswrapper[4807]: I1202 21:24:56.458680 4807 generic.go:334] "Generic (PLEG): container finished" podID="e6826607-5100-439e-b82d-224b312a6faa" containerID="1737b084f5f6e04b118288204293c483cd4d470b8df535f2e80004e54efd3ea1" exitCode=0 Dec 02 21:24:56 crc kubenswrapper[4807]: I1202 21:24:56.458869 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e6826607-5100-439e-b82d-224b312a6faa","Type":"ContainerDied","Data":"1737b084f5f6e04b118288204293c483cd4d470b8df535f2e80004e54efd3ea1"} Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.955666 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.994850 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ssh-key\") pod \"e6826607-5100-439e-b82d-224b312a6faa\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.995065 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e6826607-5100-439e-b82d-224b312a6faa\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.995193 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ca-certs\") pod \"e6826607-5100-439e-b82d-224b312a6faa\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.995254 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-config-data\") pod \"e6826607-5100-439e-b82d-224b312a6faa\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.995314 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqxx5\" (UniqueName: \"kubernetes.io/projected/e6826607-5100-439e-b82d-224b312a6faa-kube-api-access-tqxx5\") pod \"e6826607-5100-439e-b82d-224b312a6faa\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.995352 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-temporary\") pod \"e6826607-5100-439e-b82d-224b312a6faa\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.995423 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-openstack-config\") pod \"e6826607-5100-439e-b82d-224b312a6faa\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.995506 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-openstack-config-secret\") pod \"e6826607-5100-439e-b82d-224b312a6faa\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.995570 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-workdir\") pod \"e6826607-5100-439e-b82d-224b312a6faa\" (UID: \"e6826607-5100-439e-b82d-224b312a6faa\") " Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.996802 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "e6826607-5100-439e-b82d-224b312a6faa" (UID: "e6826607-5100-439e-b82d-224b312a6faa"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:24:57 crc kubenswrapper[4807]: I1202 21:24:57.997824 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-config-data" (OuterVolumeSpecName: "config-data") pod "e6826607-5100-439e-b82d-224b312a6faa" (UID: "e6826607-5100-439e-b82d-224b312a6faa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.010946 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "e6826607-5100-439e-b82d-224b312a6faa" (UID: "e6826607-5100-439e-b82d-224b312a6faa"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.024275 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6826607-5100-439e-b82d-224b312a6faa-kube-api-access-tqxx5" (OuterVolumeSpecName: "kube-api-access-tqxx5") pod "e6826607-5100-439e-b82d-224b312a6faa" (UID: "e6826607-5100-439e-b82d-224b312a6faa"). InnerVolumeSpecName "kube-api-access-tqxx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.053470 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e6826607-5100-439e-b82d-224b312a6faa" (UID: "e6826607-5100-439e-b82d-224b312a6faa"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.056266 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e6826607-5100-439e-b82d-224b312a6faa" (UID: "e6826607-5100-439e-b82d-224b312a6faa"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.059330 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e6826607-5100-439e-b82d-224b312a6faa" (UID: "e6826607-5100-439e-b82d-224b312a6faa"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.098776 4807 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.098882 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.098922 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqxx5\" (UniqueName: \"kubernetes.io/projected/e6826607-5100-439e-b82d-224b312a6faa-kube-api-access-tqxx5\") on node \"crc\" DevicePath \"\"" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.098936 4807 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.098948 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.098962 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6826607-5100-439e-b82d-224b312a6faa-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.099022 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.099747 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e6826607-5100-439e-b82d-224b312a6faa" (UID: "e6826607-5100-439e-b82d-224b312a6faa"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.107187 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e6826607-5100-439e-b82d-224b312a6faa" (UID: "e6826607-5100-439e-b82d-224b312a6faa"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.128268 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.201149 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e6826607-5100-439e-b82d-224b312a6faa-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.201192 4807 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e6826607-5100-439e-b82d-224b312a6faa-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.201212 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.488234 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e6826607-5100-439e-b82d-224b312a6faa","Type":"ContainerDied","Data":"f4e6fb4235efc99236d6accf9b3a178915df5202af061ae369943751065a305d"} Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.488301 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4e6fb4235efc99236d6accf9b3a178915df5202af061ae369943751065a305d" Dec 02 21:24:58 crc kubenswrapper[4807]: I1202 21:24:58.488352 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 21:24:59 crc kubenswrapper[4807]: I1202 21:24:59.972353 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:24:59 crc kubenswrapper[4807]: E1202 21:24:59.973004 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 21:25:08.795185 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 21:25:08 crc kubenswrapper[4807]: E1202 21:25:08.799977 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" containerName="extract-content" Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 21:25:08.800017 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" containerName="extract-content" Dec 02 21:25:08 crc kubenswrapper[4807]: E1202 21:25:08.800048 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" containerName="extract-utilities" Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 21:25:08.800060 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" containerName="extract-utilities" Dec 02 21:25:08 crc kubenswrapper[4807]: E1202 21:25:08.800097 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6826607-5100-439e-b82d-224b312a6faa" containerName="tempest-tests-tempest-tests-runner" Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 
21:25:08.800109 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6826607-5100-439e-b82d-224b312a6faa" containerName="tempest-tests-tempest-tests-runner" Dec 02 21:25:08 crc kubenswrapper[4807]: E1202 21:25:08.800138 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" containerName="registry-server" Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 21:25:08.800149 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" containerName="registry-server" Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 21:25:08.800453 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6826607-5100-439e-b82d-224b312a6faa" containerName="tempest-tests-tempest-tests-runner" Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 21:25:08.800487 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c7dfe2-a59d-49fe-8024-9eb83960cbc3" containerName="registry-server" Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 21:25:08.801609 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 21:25:08.804869 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6wh86" Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 21:25:08.834981 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 21:25:08.961840 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cth7h\" (UniqueName: \"kubernetes.io/projected/f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb-kube-api-access-cth7h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 21:25:08 crc kubenswrapper[4807]: I1202 21:25:08.962580 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 21:25:09 crc kubenswrapper[4807]: I1202 21:25:09.066659 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cth7h\" (UniqueName: \"kubernetes.io/projected/f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb-kube-api-access-cth7h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 21:25:09 crc kubenswrapper[4807]: I1202 21:25:09.066801 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 21:25:09 crc kubenswrapper[4807]: I1202 21:25:09.067282 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 21:25:09 crc kubenswrapper[4807]: I1202 21:25:09.091569 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cth7h\" (UniqueName: \"kubernetes.io/projected/f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb-kube-api-access-cth7h\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 21:25:09 crc kubenswrapper[4807]: I1202 21:25:09.102103 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 21:25:09 crc kubenswrapper[4807]: I1202 21:25:09.145168 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 21:25:09 crc kubenswrapper[4807]: I1202 21:25:09.644666 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 21:25:10 crc kubenswrapper[4807]: I1202 21:25:10.620145 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb","Type":"ContainerStarted","Data":"9fad362f9e5f079c57fd423287f54e795e161154d2edfd31d793f41f4ec2760d"} Dec 02 21:25:11 crc kubenswrapper[4807]: I1202 21:25:11.631515 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb","Type":"ContainerStarted","Data":"c49ea17c0b8a80fc99b94c70c983fbac945be746ca2dd3d707f3cc6dba1970b3"} Dec 02 21:25:11 crc kubenswrapper[4807]: I1202 21:25:11.645297 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.643720327 podStartE2EDuration="3.645266126s" podCreationTimestamp="2025-12-02 21:25:08 +0000 UTC" firstStartedPulling="2025-12-02 21:25:09.650771766 +0000 UTC m=+5244.951679261" lastFinishedPulling="2025-12-02 21:25:10.652317565 +0000 UTC m=+5245.953225060" observedRunningTime="2025-12-02 21:25:11.645171764 +0000 UTC m=+5246.946079279" watchObservedRunningTime="2025-12-02 21:25:11.645266126 +0000 UTC m=+5246.946173621" Dec 02 21:25:13 crc kubenswrapper[4807]: I1202 21:25:13.972557 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:25:13 crc kubenswrapper[4807]: E1202 21:25:13.973812 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:25:27 crc kubenswrapper[4807]: I1202 21:25:27.974065 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:25:27 crc kubenswrapper[4807]: E1202 21:25:27.974857 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.578153 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7hxrb/must-gather-lx7rf"] Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.580375 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7hxrb/must-gather-lx7rf" Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.582281 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7hxrb"/"default-dockercfg-ftk75" Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.583161 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7hxrb"/"kube-root-ca.crt" Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.584927 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7hxrb"/"openshift-service-ca.crt" Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.589330 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7hxrb/must-gather-lx7rf"] Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.701934 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-must-gather-output\") pod \"must-gather-lx7rf\" (UID: \"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8\") " pod="openshift-must-gather-7hxrb/must-gather-lx7rf" Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.702020 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpsgd\" (UniqueName: \"kubernetes.io/projected/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-kube-api-access-zpsgd\") pod \"must-gather-lx7rf\" (UID: \"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8\") " pod="openshift-must-gather-7hxrb/must-gather-lx7rf" Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.804883 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-must-gather-output\") pod \"must-gather-lx7rf\" (UID: \"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8\") " 
pod="openshift-must-gather-7hxrb/must-gather-lx7rf" Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.804955 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpsgd\" (UniqueName: \"kubernetes.io/projected/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-kube-api-access-zpsgd\") pod \"must-gather-lx7rf\" (UID: \"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8\") " pod="openshift-must-gather-7hxrb/must-gather-lx7rf" Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.805458 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-must-gather-output\") pod \"must-gather-lx7rf\" (UID: \"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8\") " pod="openshift-must-gather-7hxrb/must-gather-lx7rf" Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.827277 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpsgd\" (UniqueName: \"kubernetes.io/projected/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-kube-api-access-zpsgd\") pod \"must-gather-lx7rf\" (UID: \"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8\") " pod="openshift-must-gather-7hxrb/must-gather-lx7rf" Dec 02 21:25:34 crc kubenswrapper[4807]: I1202 21:25:34.901336 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7hxrb/must-gather-lx7rf" Dec 02 21:25:35 crc kubenswrapper[4807]: I1202 21:25:35.399556 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7hxrb/must-gather-lx7rf"] Dec 02 21:25:35 crc kubenswrapper[4807]: I1202 21:25:35.988520 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/must-gather-lx7rf" event={"ID":"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8","Type":"ContainerStarted","Data":"9f0993f839953fa5666b9be8dd5855b6511c6682ff112ad1d8229ed9c10f9b49"} Dec 02 21:25:38 crc kubenswrapper[4807]: I1202 21:25:38.972858 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:25:38 crc kubenswrapper[4807]: E1202 21:25:38.973895 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:25:40 crc kubenswrapper[4807]: I1202 21:25:40.036810 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/must-gather-lx7rf" event={"ID":"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8","Type":"ContainerStarted","Data":"944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb"} Dec 02 21:25:41 crc kubenswrapper[4807]: I1202 21:25:41.057752 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/must-gather-lx7rf" event={"ID":"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8","Type":"ContainerStarted","Data":"a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e"} Dec 02 21:25:41 crc kubenswrapper[4807]: I1202 21:25:41.084532 4807 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-7hxrb/must-gather-lx7rf" podStartSLOduration=2.969840429 podStartE2EDuration="7.084506779s" podCreationTimestamp="2025-12-02 21:25:34 +0000 UTC" firstStartedPulling="2025-12-02 21:25:35.404840391 +0000 UTC m=+5270.705747926" lastFinishedPulling="2025-12-02 21:25:39.519506741 +0000 UTC m=+5274.820414276" observedRunningTime="2025-12-02 21:25:41.081964666 +0000 UTC m=+5276.382872211" watchObservedRunningTime="2025-12-02 21:25:41.084506779 +0000 UTC m=+5276.385414294" Dec 02 21:25:43 crc kubenswrapper[4807]: I1202 21:25:43.711067 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7hxrb/crc-debug-8wqpn"] Dec 02 21:25:43 crc kubenswrapper[4807]: I1202 21:25:43.713092 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" Dec 02 21:25:43 crc kubenswrapper[4807]: I1202 21:25:43.826830 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85ed5ed1-0df4-4b12-88b3-4f919c507be2-host\") pod \"crc-debug-8wqpn\" (UID: \"85ed5ed1-0df4-4b12-88b3-4f919c507be2\") " pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" Dec 02 21:25:43 crc kubenswrapper[4807]: I1202 21:25:43.827208 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d2kn\" (UniqueName: \"kubernetes.io/projected/85ed5ed1-0df4-4b12-88b3-4f919c507be2-kube-api-access-6d2kn\") pod \"crc-debug-8wqpn\" (UID: \"85ed5ed1-0df4-4b12-88b3-4f919c507be2\") " pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" Dec 02 21:25:43 crc kubenswrapper[4807]: I1202 21:25:43.929435 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d2kn\" (UniqueName: \"kubernetes.io/projected/85ed5ed1-0df4-4b12-88b3-4f919c507be2-kube-api-access-6d2kn\") pod \"crc-debug-8wqpn\" (UID: 
\"85ed5ed1-0df4-4b12-88b3-4f919c507be2\") " pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" Dec 02 21:25:43 crc kubenswrapper[4807]: I1202 21:25:43.929600 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85ed5ed1-0df4-4b12-88b3-4f919c507be2-host\") pod \"crc-debug-8wqpn\" (UID: \"85ed5ed1-0df4-4b12-88b3-4f919c507be2\") " pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" Dec 02 21:25:43 crc kubenswrapper[4807]: I1202 21:25:43.929740 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85ed5ed1-0df4-4b12-88b3-4f919c507be2-host\") pod \"crc-debug-8wqpn\" (UID: \"85ed5ed1-0df4-4b12-88b3-4f919c507be2\") " pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" Dec 02 21:25:43 crc kubenswrapper[4807]: I1202 21:25:43.950048 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d2kn\" (UniqueName: \"kubernetes.io/projected/85ed5ed1-0df4-4b12-88b3-4f919c507be2-kube-api-access-6d2kn\") pod \"crc-debug-8wqpn\" (UID: \"85ed5ed1-0df4-4b12-88b3-4f919c507be2\") " pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" Dec 02 21:25:44 crc kubenswrapper[4807]: I1202 21:25:44.046981 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" Dec 02 21:25:45 crc kubenswrapper[4807]: I1202 21:25:45.113394 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" event={"ID":"85ed5ed1-0df4-4b12-88b3-4f919c507be2","Type":"ContainerStarted","Data":"087837af4a9dbc8b545928ba43beadcde5fe688f01232e4ae6723657a818097f"} Dec 02 21:25:51 crc kubenswrapper[4807]: I1202 21:25:51.972818 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:25:51 crc kubenswrapper[4807]: E1202 21:25:51.973742 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.109796 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-58z78"] Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.112875 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-58z78" Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.121638 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58z78"] Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.232847 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-utilities\") pod \"certified-operators-58z78\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") " pod="openshift-marketplace/certified-operators-58z78" Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.232914 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srf7k\" (UniqueName: \"kubernetes.io/projected/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-kube-api-access-srf7k\") pod \"certified-operators-58z78\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") " pod="openshift-marketplace/certified-operators-58z78" Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.232968 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-catalog-content\") pod \"certified-operators-58z78\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") " pod="openshift-marketplace/certified-operators-58z78" Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.335046 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-utilities\") pod \"certified-operators-58z78\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") " pod="openshift-marketplace/certified-operators-58z78" Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.335154 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-srf7k\" (UniqueName: \"kubernetes.io/projected/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-kube-api-access-srf7k\") pod \"certified-operators-58z78\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") " pod="openshift-marketplace/certified-operators-58z78" Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.335209 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-catalog-content\") pod \"certified-operators-58z78\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") " pod="openshift-marketplace/certified-operators-58z78" Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.335706 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-utilities\") pod \"certified-operators-58z78\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") " pod="openshift-marketplace/certified-operators-58z78" Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.335753 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-catalog-content\") pod \"certified-operators-58z78\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") " pod="openshift-marketplace/certified-operators-58z78" Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.387894 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srf7k\" (UniqueName: \"kubernetes.io/projected/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-kube-api-access-srf7k\") pod \"certified-operators-58z78\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") " pod="openshift-marketplace/certified-operators-58z78" Dec 02 21:25:53 crc kubenswrapper[4807]: I1202 21:25:53.454054 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-58z78" Dec 02 21:25:54 crc kubenswrapper[4807]: I1202 21:25:54.354567 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58z78"] Dec 02 21:25:54 crc kubenswrapper[4807]: W1202 21:25:54.384661 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b3c67ca_b5ed_431d_9ef9_a8252f832a99.slice/crio-e59f7f56c8787f07ff60fdeb8912c9540e3ad0f8abf2e1c3376174c317ee3ad5 WatchSource:0}: Error finding container e59f7f56c8787f07ff60fdeb8912c9540e3ad0f8abf2e1c3376174c317ee3ad5: Status 404 returned error can't find the container with id e59f7f56c8787f07ff60fdeb8912c9540e3ad0f8abf2e1c3376174c317ee3ad5 Dec 02 21:25:55 crc kubenswrapper[4807]: I1202 21:25:55.319120 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" event={"ID":"85ed5ed1-0df4-4b12-88b3-4f919c507be2","Type":"ContainerStarted","Data":"4fc04e70826148105bea0d04586d5023cae7f60628adc130cb839ab2ba04edfd"} Dec 02 21:25:55 crc kubenswrapper[4807]: I1202 21:25:55.321574 4807 generic.go:334] "Generic (PLEG): container finished" podID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" containerID="6d527b7208498ae51258e33156deb76fdaed528bcaa12bc5a1d52601664a22e5" exitCode=0 Dec 02 21:25:55 crc kubenswrapper[4807]: I1202 21:25:55.321676 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58z78" event={"ID":"7b3c67ca-b5ed-431d-9ef9-a8252f832a99","Type":"ContainerDied","Data":"6d527b7208498ae51258e33156deb76fdaed528bcaa12bc5a1d52601664a22e5"} Dec 02 21:25:55 crc kubenswrapper[4807]: I1202 21:25:55.321772 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58z78" 
event={"ID":"7b3c67ca-b5ed-431d-9ef9-a8252f832a99","Type":"ContainerStarted","Data":"e59f7f56c8787f07ff60fdeb8912c9540e3ad0f8abf2e1c3376174c317ee3ad5"} Dec 02 21:25:55 crc kubenswrapper[4807]: I1202 21:25:55.340276 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" podStartSLOduration=2.422205635 podStartE2EDuration="12.340251566s" podCreationTimestamp="2025-12-02 21:25:43 +0000 UTC" firstStartedPulling="2025-12-02 21:25:44.084448701 +0000 UTC m=+5279.385356206" lastFinishedPulling="2025-12-02 21:25:54.002494642 +0000 UTC m=+5289.303402137" observedRunningTime="2025-12-02 21:25:55.334860981 +0000 UTC m=+5290.635768476" watchObservedRunningTime="2025-12-02 21:25:55.340251566 +0000 UTC m=+5290.641159061" Dec 02 21:25:56 crc kubenswrapper[4807]: I1202 21:25:56.335146 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58z78" event={"ID":"7b3c67ca-b5ed-431d-9ef9-a8252f832a99","Type":"ContainerStarted","Data":"47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b"} Dec 02 21:25:57 crc kubenswrapper[4807]: I1202 21:25:57.345688 4807 generic.go:334] "Generic (PLEG): container finished" podID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" containerID="47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b" exitCode=0 Dec 02 21:25:57 crc kubenswrapper[4807]: I1202 21:25:57.346216 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58z78" event={"ID":"7b3c67ca-b5ed-431d-9ef9-a8252f832a99","Type":"ContainerDied","Data":"47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b"} Dec 02 21:26:00 crc kubenswrapper[4807]: I1202 21:26:00.377942 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58z78" 
event={"ID":"7b3c67ca-b5ed-431d-9ef9-a8252f832a99","Type":"ContainerStarted","Data":"e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0"}
Dec 02 21:26:00 crc kubenswrapper[4807]: I1202 21:26:00.410481 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-58z78" podStartSLOduration=3.252278015 podStartE2EDuration="7.410460635s" podCreationTimestamp="2025-12-02 21:25:53 +0000 UTC" firstStartedPulling="2025-12-02 21:25:55.324331369 +0000 UTC m=+5290.625238864" lastFinishedPulling="2025-12-02 21:25:59.482513979 +0000 UTC m=+5294.783421484" observedRunningTime="2025-12-02 21:26:00.405210144 +0000 UTC m=+5295.706117649" watchObservedRunningTime="2025-12-02 21:26:00.410460635 +0000 UTC m=+5295.711368140"
Dec 02 21:26:03 crc kubenswrapper[4807]: I1202 21:26:03.456111 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-58z78"
Dec 02 21:26:03 crc kubenswrapper[4807]: I1202 21:26:03.456589 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-58z78"
Dec 02 21:26:03 crc kubenswrapper[4807]: I1202 21:26:03.510706 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-58z78"
Dec 02 21:26:04 crc kubenswrapper[4807]: I1202 21:26:04.464969 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-58z78"
Dec 02 21:26:04 crc kubenswrapper[4807]: I1202 21:26:04.524551 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58z78"]
Dec 02 21:26:05 crc kubenswrapper[4807]: I1202 21:26:05.973340 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"
Dec 02 21:26:05 crc kubenswrapper[4807]: E1202 21:26:05.974243 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:26:06 crc kubenswrapper[4807]: I1202 21:26:06.432956 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-58z78" podUID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" containerName="registry-server" containerID="cri-o://e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0" gracePeriod=2
Dec 02 21:26:06 crc kubenswrapper[4807]: I1202 21:26:06.952002 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58z78"
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.023154 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-catalog-content\") pod \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") "
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.024170 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srf7k\" (UniqueName: \"kubernetes.io/projected/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-kube-api-access-srf7k\") pod \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") "
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.024386 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-utilities\") pod \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\" (UID: \"7b3c67ca-b5ed-431d-9ef9-a8252f832a99\") "
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.025222 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-utilities" (OuterVolumeSpecName: "utilities") pod "7b3c67ca-b5ed-431d-9ef9-a8252f832a99" (UID: "7b3c67ca-b5ed-431d-9ef9-a8252f832a99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.029313 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-kube-api-access-srf7k" (OuterVolumeSpecName: "kube-api-access-srf7k") pod "7b3c67ca-b5ed-431d-9ef9-a8252f832a99" (UID: "7b3c67ca-b5ed-431d-9ef9-a8252f832a99"). InnerVolumeSpecName "kube-api-access-srf7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.067982 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b3c67ca-b5ed-431d-9ef9-a8252f832a99" (UID: "7b3c67ca-b5ed-431d-9ef9-a8252f832a99"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.126695 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srf7k\" (UniqueName: \"kubernetes.io/projected/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-kube-api-access-srf7k\") on node \"crc\" DevicePath \"\""
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.126952 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.126962 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b3c67ca-b5ed-431d-9ef9-a8252f832a99-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.444198 4807 generic.go:334] "Generic (PLEG): container finished" podID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" containerID="e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0" exitCode=0
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.444243 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58z78" event={"ID":"7b3c67ca-b5ed-431d-9ef9-a8252f832a99","Type":"ContainerDied","Data":"e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0"}
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.444274 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58z78" event={"ID":"7b3c67ca-b5ed-431d-9ef9-a8252f832a99","Type":"ContainerDied","Data":"e59f7f56c8787f07ff60fdeb8912c9540e3ad0f8abf2e1c3376174c317ee3ad5"}
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.444293 4807 scope.go:117] "RemoveContainer" containerID="e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0"
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.444416 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-58z78"
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.490431 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58z78"]
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.501013 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-58z78"]
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.502303 4807 scope.go:117] "RemoveContainer" containerID="47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b"
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.528174 4807 scope.go:117] "RemoveContainer" containerID="6d527b7208498ae51258e33156deb76fdaed528bcaa12bc5a1d52601664a22e5"
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.581929 4807 scope.go:117] "RemoveContainer" containerID="e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0"
Dec 02 21:26:07 crc kubenswrapper[4807]: E1202 21:26:07.582389 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0\": container with ID starting with e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0 not found: ID does not exist" containerID="e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0"
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.582420 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0"} err="failed to get container status \"e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0\": rpc error: code = NotFound desc = could not find container \"e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0\": container with ID starting with e8e9ea69ba790e79cb5037dcbe674747a6b0fb24a6084c5f07bffd897cd81db0 not found: ID does not exist"
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.582441 4807 scope.go:117] "RemoveContainer" containerID="47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b"
Dec 02 21:26:07 crc kubenswrapper[4807]: E1202 21:26:07.582646 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b\": container with ID starting with 47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b not found: ID does not exist" containerID="47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b"
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.582667 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b"} err="failed to get container status \"47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b\": rpc error: code = NotFound desc = could not find container \"47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b\": container with ID starting with 47cb43dd0827378a99c3b67b53a56d584daa8cff2c1f7f2b1c70d03c2a8a1e6b not found: ID does not exist"
Dec 02 21:26:07 crc kubenswrapper[4807]: I1202 21:26:07.582679 4807 scope.go:117] "RemoveContainer" containerID="6d527b7208498ae51258e33156deb76fdaed528bcaa12bc5a1d52601664a22e5"
Dec 02 21:26:07 crc kubenswrapper[4807]: E1202 21:26:07.582863 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d527b7208498ae51258e33156deb76fdaed528bcaa12bc5a1d52601664a22e5\": container with ID starting with 6d527b7208498ae51258e33156deb76fdaed528bcaa12bc5a1d52601664a22e5 not found: ID does not exist" containerID="6d527b7208498ae51258e33156deb76fdaed528bcaa12bc5a1d52601664a22e5"
Dec 02 crc kubenswrapper[4807]: I1202 21:26:07.582884 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d527b7208498ae51258e33156deb76fdaed528bcaa12bc5a1d52601664a22e5"} err="failed to get container status \"6d527b7208498ae51258e33156deb76fdaed528bcaa12bc5a1d52601664a22e5\": rpc error: code = NotFound desc = could not find container \"6d527b7208498ae51258e33156deb76fdaed528bcaa12bc5a1d52601664a22e5\": container with ID starting with 6d527b7208498ae51258e33156deb76fdaed528bcaa12bc5a1d52601664a22e5 not found: ID does not exist"
Dec 02 21:26:08 crc kubenswrapper[4807]: I1202 21:26:08.983875 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" path="/var/lib/kubelet/pods/7b3c67ca-b5ed-431d-9ef9-a8252f832a99/volumes"
Dec 02 21:26:20 crc kubenswrapper[4807]: I1202 21:26:20.973069 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"
Dec 02 21:26:20 crc kubenswrapper[4807]: E1202 21:26:20.973943 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:26:31 crc kubenswrapper[4807]: I1202 21:26:31.973412 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"
Dec 02 21:26:31 crc kubenswrapper[4807]: E1202 21:26:31.974173 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:26:42 crc kubenswrapper[4807]: I1202 21:26:42.973403 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"
Dec 02 21:26:42 crc kubenswrapper[4807]: E1202 21:26:42.974266 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:26:43 crc kubenswrapper[4807]: I1202 21:26:43.812643 4807 generic.go:334] "Generic (PLEG): container finished" podID="85ed5ed1-0df4-4b12-88b3-4f919c507be2" containerID="4fc04e70826148105bea0d04586d5023cae7f60628adc130cb839ab2ba04edfd" exitCode=0
Dec 02 21:26:43 crc kubenswrapper[4807]: I1202 21:26:43.812781 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/crc-debug-8wqpn" event={"ID":"85ed5ed1-0df4-4b12-88b3-4f919c507be2","Type":"ContainerDied","Data":"4fc04e70826148105bea0d04586d5023cae7f60628adc130cb839ab2ba04edfd"}
Dec 02 21:26:44 crc kubenswrapper[4807]: I1202 21:26:44.964316 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-8wqpn"
Dec 02 21:26:45 crc kubenswrapper[4807]: I1202 21:26:45.015511 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7hxrb/crc-debug-8wqpn"]
Dec 02 21:26:45 crc kubenswrapper[4807]: I1202 21:26:45.026395 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7hxrb/crc-debug-8wqpn"]
Dec 02 21:26:45 crc kubenswrapper[4807]: I1202 21:26:45.045672 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85ed5ed1-0df4-4b12-88b3-4f919c507be2-host\") pod \"85ed5ed1-0df4-4b12-88b3-4f919c507be2\" (UID: \"85ed5ed1-0df4-4b12-88b3-4f919c507be2\") "
Dec 02 21:26:45 crc kubenswrapper[4807]: I1202 21:26:45.045736 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d2kn\" (UniqueName: \"kubernetes.io/projected/85ed5ed1-0df4-4b12-88b3-4f919c507be2-kube-api-access-6d2kn\") pod \"85ed5ed1-0df4-4b12-88b3-4f919c507be2\" (UID: \"85ed5ed1-0df4-4b12-88b3-4f919c507be2\") "
Dec 02 21:26:45 crc kubenswrapper[4807]: I1202 21:26:45.045869 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85ed5ed1-0df4-4b12-88b3-4f919c507be2-host" (OuterVolumeSpecName: "host") pod "85ed5ed1-0df4-4b12-88b3-4f919c507be2" (UID: "85ed5ed1-0df4-4b12-88b3-4f919c507be2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 02 21:26:45 crc kubenswrapper[4807]: I1202 21:26:45.046434 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85ed5ed1-0df4-4b12-88b3-4f919c507be2-host\") on node \"crc\" DevicePath \"\""
Dec 02 21:26:45 crc kubenswrapper[4807]: I1202 21:26:45.062952 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ed5ed1-0df4-4b12-88b3-4f919c507be2-kube-api-access-6d2kn" (OuterVolumeSpecName: "kube-api-access-6d2kn") pod "85ed5ed1-0df4-4b12-88b3-4f919c507be2" (UID: "85ed5ed1-0df4-4b12-88b3-4f919c507be2"). InnerVolumeSpecName "kube-api-access-6d2kn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 21:26:45 crc kubenswrapper[4807]: I1202 21:26:45.148493 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d2kn\" (UniqueName: \"kubernetes.io/projected/85ed5ed1-0df4-4b12-88b3-4f919c507be2-kube-api-access-6d2kn\") on node \"crc\" DevicePath \"\""
Dec 02 21:26:45 crc kubenswrapper[4807]: I1202 21:26:45.835594 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="087837af4a9dbc8b545928ba43beadcde5fe688f01232e4ae6723657a818097f"
Dec 02 21:26:45 crc kubenswrapper[4807]: I1202 21:26:45.835763 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-8wqpn"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.238682 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7hxrb/crc-debug-bqlrt"]
Dec 02 21:26:46 crc kubenswrapper[4807]: E1202 21:26:46.239383 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" containerName="extract-content"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.239396 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" containerName="extract-content"
Dec 02 21:26:46 crc kubenswrapper[4807]: E1202 21:26:46.239406 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" containerName="extract-utilities"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.239412 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" containerName="extract-utilities"
Dec 02 21:26:46 crc kubenswrapper[4807]: E1202 21:26:46.239429 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" containerName="registry-server"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.239436 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" containerName="registry-server"
Dec 02 21:26:46 crc kubenswrapper[4807]: E1202 21:26:46.239462 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ed5ed1-0df4-4b12-88b3-4f919c507be2" containerName="container-00"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.239468 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ed5ed1-0df4-4b12-88b3-4f919c507be2" containerName="container-00"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.239633 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ed5ed1-0df4-4b12-88b3-4f919c507be2" containerName="container-00"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.239647 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3c67ca-b5ed-431d-9ef9-a8252f832a99" containerName="registry-server"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.240393 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-bqlrt"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.370251 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l27n\" (UniqueName: \"kubernetes.io/projected/cd4d2387-d437-444d-894a-1d22c448b5e3-kube-api-access-6l27n\") pod \"crc-debug-bqlrt\" (UID: \"cd4d2387-d437-444d-894a-1d22c448b5e3\") " pod="openshift-must-gather-7hxrb/crc-debug-bqlrt"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.370347 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd4d2387-d437-444d-894a-1d22c448b5e3-host\") pod \"crc-debug-bqlrt\" (UID: \"cd4d2387-d437-444d-894a-1d22c448b5e3\") " pod="openshift-must-gather-7hxrb/crc-debug-bqlrt"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.472538 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l27n\" (UniqueName: \"kubernetes.io/projected/cd4d2387-d437-444d-894a-1d22c448b5e3-kube-api-access-6l27n\") pod \"crc-debug-bqlrt\" (UID: \"cd4d2387-d437-444d-894a-1d22c448b5e3\") " pod="openshift-must-gather-7hxrb/crc-debug-bqlrt"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.472632 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd4d2387-d437-444d-894a-1d22c448b5e3-host\") pod \"crc-debug-bqlrt\" (UID: \"cd4d2387-d437-444d-894a-1d22c448b5e3\") " pod="openshift-must-gather-7hxrb/crc-debug-bqlrt"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.472902 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd4d2387-d437-444d-894a-1d22c448b5e3-host\") pod \"crc-debug-bqlrt\" (UID: \"cd4d2387-d437-444d-894a-1d22c448b5e3\") " pod="openshift-must-gather-7hxrb/crc-debug-bqlrt"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.496307 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l27n\" (UniqueName: \"kubernetes.io/projected/cd4d2387-d437-444d-894a-1d22c448b5e3-kube-api-access-6l27n\") pod \"crc-debug-bqlrt\" (UID: \"cd4d2387-d437-444d-894a-1d22c448b5e3\") " pod="openshift-must-gather-7hxrb/crc-debug-bqlrt"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.560667 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-bqlrt"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.847233 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/crc-debug-bqlrt" event={"ID":"cd4d2387-d437-444d-894a-1d22c448b5e3","Type":"ContainerStarted","Data":"5eea71c0ed094b5bcfe6eacde1ca7c2459aa6beefa57ceec27c6f305a297e722"}
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.847650 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/crc-debug-bqlrt" event={"ID":"cd4d2387-d437-444d-894a-1d22c448b5e3","Type":"ContainerStarted","Data":"6335fe1419be468ae144596af3a8873bb0745b423c705f894910af848bda4760"}
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.865961 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7hxrb/crc-debug-bqlrt" podStartSLOduration=0.865934771 podStartE2EDuration="865.934771ms" podCreationTimestamp="2025-12-02 21:26:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 21:26:46.858126637 +0000 UTC m=+5342.159034132" watchObservedRunningTime="2025-12-02 21:26:46.865934771 +0000 UTC m=+5342.166842286"
Dec 02 21:26:46 crc kubenswrapper[4807]: I1202 21:26:46.983800 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ed5ed1-0df4-4b12-88b3-4f919c507be2" path="/var/lib/kubelet/pods/85ed5ed1-0df4-4b12-88b3-4f919c507be2/volumes"
Dec 02 21:26:47 crc kubenswrapper[4807]: I1202 21:26:47.857623 4807 generic.go:334] "Generic (PLEG): container finished" podID="cd4d2387-d437-444d-894a-1d22c448b5e3" containerID="5eea71c0ed094b5bcfe6eacde1ca7c2459aa6beefa57ceec27c6f305a297e722" exitCode=0
Dec 02 21:26:47 crc kubenswrapper[4807]: I1202 21:26:47.857737 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/crc-debug-bqlrt" event={"ID":"cd4d2387-d437-444d-894a-1d22c448b5e3","Type":"ContainerDied","Data":"5eea71c0ed094b5bcfe6eacde1ca7c2459aa6beefa57ceec27c6f305a297e722"}
Dec 02 21:26:48 crc kubenswrapper[4807]: I1202 21:26:48.961496 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-bqlrt"
Dec 02 21:26:49 crc kubenswrapper[4807]: I1202 21:26:49.016319 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l27n\" (UniqueName: \"kubernetes.io/projected/cd4d2387-d437-444d-894a-1d22c448b5e3-kube-api-access-6l27n\") pod \"cd4d2387-d437-444d-894a-1d22c448b5e3\" (UID: \"cd4d2387-d437-444d-894a-1d22c448b5e3\") "
Dec 02 21:26:49 crc kubenswrapper[4807]: I1202 21:26:49.016572 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd4d2387-d437-444d-894a-1d22c448b5e3-host\") pod \"cd4d2387-d437-444d-894a-1d22c448b5e3\" (UID: \"cd4d2387-d437-444d-894a-1d22c448b5e3\") "
Dec 02 21:26:49 crc kubenswrapper[4807]: I1202 21:26:49.017091 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd4d2387-d437-444d-894a-1d22c448b5e3-host" (OuterVolumeSpecName: "host") pod "cd4d2387-d437-444d-894a-1d22c448b5e3" (UID: "cd4d2387-d437-444d-894a-1d22c448b5e3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 02 21:26:49 crc kubenswrapper[4807]: I1202 21:26:49.041482 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4d2387-d437-444d-894a-1d22c448b5e3-kube-api-access-6l27n" (OuterVolumeSpecName: "kube-api-access-6l27n") pod "cd4d2387-d437-444d-894a-1d22c448b5e3" (UID: "cd4d2387-d437-444d-894a-1d22c448b5e3"). InnerVolumeSpecName "kube-api-access-6l27n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 21:26:49 crc kubenswrapper[4807]: I1202 21:26:49.120312 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd4d2387-d437-444d-894a-1d22c448b5e3-host\") on node \"crc\" DevicePath \"\""
Dec 02 21:26:49 crc kubenswrapper[4807]: I1202 21:26:49.120470 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l27n\" (UniqueName: \"kubernetes.io/projected/cd4d2387-d437-444d-894a-1d22c448b5e3-kube-api-access-6l27n\") on node \"crc\" DevicePath \"\""
Dec 02 21:26:49 crc kubenswrapper[4807]: I1202 21:26:49.229557 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7hxrb/crc-debug-bqlrt"]
Dec 02 21:26:49 crc kubenswrapper[4807]: I1202 21:26:49.237928 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7hxrb/crc-debug-bqlrt"]
Dec 02 21:26:49 crc kubenswrapper[4807]: I1202 21:26:49.874952 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6335fe1419be468ae144596af3a8873bb0745b423c705f894910af848bda4760"
Dec 02 21:26:49 crc kubenswrapper[4807]: I1202 21:26:49.875005 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-bqlrt"
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.525531 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7hxrb/crc-debug-kfx8c"]
Dec 02 21:26:50 crc kubenswrapper[4807]: E1202 21:26:50.527057 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4d2387-d437-444d-894a-1d22c448b5e3" containerName="container-00"
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.527086 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4d2387-d437-444d-894a-1d22c448b5e3" containerName="container-00"
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.527793 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd4d2387-d437-444d-894a-1d22c448b5e3" containerName="container-00"
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.529798 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-kfx8c"
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.650176 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c587dbf2-002d-4350-8165-c0555bc2edf9-host\") pod \"crc-debug-kfx8c\" (UID: \"c587dbf2-002d-4350-8165-c0555bc2edf9\") " pod="openshift-must-gather-7hxrb/crc-debug-kfx8c"
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.650471 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pphpv\" (UniqueName: \"kubernetes.io/projected/c587dbf2-002d-4350-8165-c0555bc2edf9-kube-api-access-pphpv\") pod \"crc-debug-kfx8c\" (UID: \"c587dbf2-002d-4350-8165-c0555bc2edf9\") " pod="openshift-must-gather-7hxrb/crc-debug-kfx8c"
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.753452 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pphpv\" (UniqueName: \"kubernetes.io/projected/c587dbf2-002d-4350-8165-c0555bc2edf9-kube-api-access-pphpv\") pod \"crc-debug-kfx8c\" (UID: \"c587dbf2-002d-4350-8165-c0555bc2edf9\") " pod="openshift-must-gather-7hxrb/crc-debug-kfx8c"
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.753665 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c587dbf2-002d-4350-8165-c0555bc2edf9-host\") pod \"crc-debug-kfx8c\" (UID: \"c587dbf2-002d-4350-8165-c0555bc2edf9\") " pod="openshift-must-gather-7hxrb/crc-debug-kfx8c"
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.753898 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c587dbf2-002d-4350-8165-c0555bc2edf9-host\") pod \"crc-debug-kfx8c\" (UID: \"c587dbf2-002d-4350-8165-c0555bc2edf9\") " pod="openshift-must-gather-7hxrb/crc-debug-kfx8c"
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.780319 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pphpv\" (UniqueName: \"kubernetes.io/projected/c587dbf2-002d-4350-8165-c0555bc2edf9-kube-api-access-pphpv\") pod \"crc-debug-kfx8c\" (UID: \"c587dbf2-002d-4350-8165-c0555bc2edf9\") " pod="openshift-must-gather-7hxrb/crc-debug-kfx8c"
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.862420 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-kfx8c"
Dec 02 21:26:50 crc kubenswrapper[4807]: W1202 21:26:50.903092 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc587dbf2_002d_4350_8165_c0555bc2edf9.slice/crio-b1e3d9a181cb20cabda67e87e437b719f6d3b18213d4435e851bb047d4823c77 WatchSource:0}: Error finding container b1e3d9a181cb20cabda67e87e437b719f6d3b18213d4435e851bb047d4823c77: Status 404 returned error can't find the container with id b1e3d9a181cb20cabda67e87e437b719f6d3b18213d4435e851bb047d4823c77
Dec 02 21:26:50 crc kubenswrapper[4807]: I1202 21:26:50.988451 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd4d2387-d437-444d-894a-1d22c448b5e3" path="/var/lib/kubelet/pods/cd4d2387-d437-444d-894a-1d22c448b5e3/volumes"
Dec 02 21:26:51 crc kubenswrapper[4807]: I1202 21:26:51.901809 4807 generic.go:334] "Generic (PLEG): container finished" podID="c587dbf2-002d-4350-8165-c0555bc2edf9" containerID="ddf9677736b273cf7e39537bfb022b473f48e3bf54102fb1f2d37ac3b81b53c2" exitCode=0
Dec 02 21:26:51 crc kubenswrapper[4807]: I1202 21:26:51.901943 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/crc-debug-kfx8c" event={"ID":"c587dbf2-002d-4350-8165-c0555bc2edf9","Type":"ContainerDied","Data":"ddf9677736b273cf7e39537bfb022b473f48e3bf54102fb1f2d37ac3b81b53c2"}
Dec 02 21:26:51 crc kubenswrapper[4807]: I1202 21:26:51.902205 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/crc-debug-kfx8c" event={"ID":"c587dbf2-002d-4350-8165-c0555bc2edf9","Type":"ContainerStarted","Data":"b1e3d9a181cb20cabda67e87e437b719f6d3b18213d4435e851bb047d4823c77"}
Dec 02 21:26:51 crc kubenswrapper[4807]: I1202 21:26:51.959339 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7hxrb/crc-debug-kfx8c"]
Dec 02 21:26:51 crc kubenswrapper[4807]: I1202 21:26:51.976878 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7hxrb/crc-debug-kfx8c"]
Dec 02 21:26:53 crc kubenswrapper[4807]: I1202 21:26:53.023043 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-kfx8c"
Dec 02 21:26:53 crc kubenswrapper[4807]: I1202 21:26:53.107815 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c587dbf2-002d-4350-8165-c0555bc2edf9-host\") pod \"c587dbf2-002d-4350-8165-c0555bc2edf9\" (UID: \"c587dbf2-002d-4350-8165-c0555bc2edf9\") "
Dec 02 21:26:53 crc kubenswrapper[4807]: I1202 21:26:53.107910 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c587dbf2-002d-4350-8165-c0555bc2edf9-host" (OuterVolumeSpecName: "host") pod "c587dbf2-002d-4350-8165-c0555bc2edf9" (UID: "c587dbf2-002d-4350-8165-c0555bc2edf9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 02 21:26:53 crc kubenswrapper[4807]: I1202 21:26:53.108032 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pphpv\" (UniqueName: \"kubernetes.io/projected/c587dbf2-002d-4350-8165-c0555bc2edf9-kube-api-access-pphpv\") pod \"c587dbf2-002d-4350-8165-c0555bc2edf9\" (UID: \"c587dbf2-002d-4350-8165-c0555bc2edf9\") "
Dec 02 21:26:53 crc kubenswrapper[4807]: I1202 21:26:53.109049 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c587dbf2-002d-4350-8165-c0555bc2edf9-host\") on node \"crc\" DevicePath \"\""
Dec 02 21:26:53 crc kubenswrapper[4807]: I1202 21:26:53.114932 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c587dbf2-002d-4350-8165-c0555bc2edf9-kube-api-access-pphpv" (OuterVolumeSpecName: "kube-api-access-pphpv") pod "c587dbf2-002d-4350-8165-c0555bc2edf9" (UID: "c587dbf2-002d-4350-8165-c0555bc2edf9"). InnerVolumeSpecName "kube-api-access-pphpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 21:26:53 crc kubenswrapper[4807]: I1202 21:26:53.210405 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pphpv\" (UniqueName: \"kubernetes.io/projected/c587dbf2-002d-4350-8165-c0555bc2edf9-kube-api-access-pphpv\") on node \"crc\" DevicePath \"\""
Dec 02 21:26:53 crc kubenswrapper[4807]: I1202 21:26:53.933876 4807 scope.go:117] "RemoveContainer" containerID="ddf9677736b273cf7e39537bfb022b473f48e3bf54102fb1f2d37ac3b81b53c2"
Dec 02 21:26:53 crc kubenswrapper[4807]: I1202 21:26:53.933959 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/crc-debug-kfx8c"
Dec 02 21:26:54 crc kubenswrapper[4807]: I1202 21:26:54.982999 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c587dbf2-002d-4350-8165-c0555bc2edf9" path="/var/lib/kubelet/pods/c587dbf2-002d-4350-8165-c0555bc2edf9/volumes"
Dec 02 21:26:56 crc kubenswrapper[4807]: I1202 21:26:56.972660 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"
Dec 02 21:26:56 crc kubenswrapper[4807]: E1202 21:26:56.973249 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:27:09 crc kubenswrapper[4807]: I1202 21:27:09.972143 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394"
Dec 02 21:27:11 crc kubenswrapper[4807]: I1202 21:27:11.179779 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"ea004cd87b9681bdd1d161f6f120895bdeed10d1a2a654222855ee3a5eb7c083"}
Dec 02 21:27:19 crc kubenswrapper[4807]: I1202 21:27:19.532128 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dff7976bd-s4t8d_0d4efb1f-6d37-4673-94fc-33623db07604/barbican-api/0.log"
Dec 02 21:27:19 crc kubenswrapper[4807]: I1202 21:27:19.701854 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dff7976bd-s4t8d_0d4efb1f-6d37-4673-94fc-33623db07604/barbican-api-log/0.log"
Dec 02 21:27:19 crc kubenswrapper[4807]: I1202 21:27:19.789547 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76444866d4-7vv98_845bcc3a-0d65-4ba8-bbb0-6f95d4778851/barbican-keystone-listener/0.log"
Dec 02 21:27:19 crc kubenswrapper[4807]: I1202 21:27:19.931318 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76444866d4-7vv98_845bcc3a-0d65-4ba8-bbb0-6f95d4778851/barbican-keystone-listener-log/0.log"
Dec 02 21:27:19 crc kubenswrapper[4807]: I1202 21:27:19.955324 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f5f4b885-lpj6r_926c4543-c7c5-41aa-a5ed-46035ee41498/barbican-worker/0.log"
Dec 02 21:27:20 crc kubenswrapper[4807]: I1202 21:27:20.025152 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f5f4b885-lpj6r_926c4543-c7c5-41aa-a5ed-46035ee41498/barbican-worker-log/0.log"
Dec 02 21:27:20 crc kubenswrapper[4807]: I1202 21:27:20.201432 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-74b28_59148eea-351d-4d4c-ba60-e39e47372466/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 21:27:20 crc kubenswrapper[4807]: I1202 21:27:20.305353 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_10d90a40-6c28-4353-91b3-87e966ad1ac7/ceilometer-central-agent/0.log"
Dec 02 21:27:20 crc kubenswrapper[4807]: I1202 21:27:20.424695 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_10d90a40-6c28-4353-91b3-87e966ad1ac7/ceilometer-notification-agent/0.log"
Dec 02 21:27:20 crc kubenswrapper[4807]: I1202 21:27:20.457081 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_10d90a40-6c28-4353-91b3-87e966ad1ac7/proxy-httpd/0.log"
Dec 02 21:27:20 crc kubenswrapper[4807]: I1202 21:27:20.472064 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_10d90a40-6c28-4353-91b3-87e966ad1ac7/sg-core/0.log"
Dec 02 21:27:20 crc kubenswrapper[4807]: I1202 21:27:20.733277 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_06947941-0c96-4330-b2f7-bbc193dcdf61/cinder-api-log/0.log"
Dec 02 21:27:20 crc kubenswrapper[4807]: I1202 21:27:20.785873 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_06947941-0c96-4330-b2f7-bbc193dcdf61/cinder-api/0.log"
Dec 02 21:27:20 crc kubenswrapper[4807]: I1202 21:27:20.882613 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_4a7b23d9-c399-44ec-995e-54726ae83774/cinder-scheduler/0.log"
Dec 02 21:27:21 crc kubenswrapper[4807]: I1202 21:27:21.002400 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_4a7b23d9-c399-44ec-995e-54726ae83774/probe/0.log"
Dec 02 21:27:21 crc kubenswrapper[4807]: I1202 21:27:21.078432 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm_5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 21:27:21 crc kubenswrapper[4807]: I1202 21:27:21.233784
4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dw249_d72aa681-f5b3-4192-aa10-a4b6fc8519b9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:21 crc kubenswrapper[4807]: I1202 21:27:21.339461 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b865b64bc-72s4n_7cb6399e-e732-4865-9d51-5f15eb42c502/init/0.log" Dec 02 21:27:21 crc kubenswrapper[4807]: I1202 21:27:21.551098 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b865b64bc-72s4n_7cb6399e-e732-4865-9d51-5f15eb42c502/init/0.log" Dec 02 21:27:21 crc kubenswrapper[4807]: I1202 21:27:21.669463 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp_3399e62b-d5c6-4469-9507-75e4e922201e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:21 crc kubenswrapper[4807]: I1202 21:27:21.671513 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b865b64bc-72s4n_7cb6399e-e732-4865-9d51-5f15eb42c502/dnsmasq-dns/0.log" Dec 02 21:27:22 crc kubenswrapper[4807]: I1202 21:27:22.056065 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_72b4a2ac-3f2c-4064-bcd9-b40585699ab9/glance-httpd/0.log" Dec 02 21:27:22 crc kubenswrapper[4807]: I1202 21:27:22.120318 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_72b4a2ac-3f2c-4064-bcd9-b40585699ab9/glance-log/0.log" Dec 02 21:27:22 crc kubenswrapper[4807]: I1202 21:27:22.250868 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7b4380db-c1e5-4f3b-81f6-ae5a6d71119a/glance-httpd/0.log" Dec 02 21:27:22 crc kubenswrapper[4807]: I1202 21:27:22.334149 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_7b4380db-c1e5-4f3b-81f6-ae5a6d71119a/glance-log/0.log" Dec 02 21:27:22 crc kubenswrapper[4807]: I1202 21:27:22.462778 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5cbfd7dcb-hzflv_f5570109-9e91-473c-8a41-47081ace3591/horizon/0.log" Dec 02 21:27:22 crc kubenswrapper[4807]: I1202 21:27:22.684028 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x_64a6a7a0-63cc-48bb-a936-21fbab3123e9/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:23 crc kubenswrapper[4807]: I1202 21:27:23.024501 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5cbfd7dcb-hzflv_f5570109-9e91-473c-8a41-47081ace3591/horizon-log/0.log" Dec 02 21:27:23 crc kubenswrapper[4807]: I1202 21:27:23.042084 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5lbnt_0c2e1673-a8ae-401a-b874-d425c01fad63/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:23 crc kubenswrapper[4807]: I1202 21:27:23.336985 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29411821-qfgdj_1f152d73-b7a0-4142-8f65-2343fca9dc2e/keystone-cron/0.log" Dec 02 21:27:23 crc kubenswrapper[4807]: I1202 21:27:23.391628 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_a603da43-4e3b-4e75-8c4e-9e90908e2af4/kube-state-metrics/0.log" Dec 02 21:27:23 crc kubenswrapper[4807]: I1202 21:27:23.392764 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6d4cd98589-bnbn5_bcde4ab3-e62a-40bc-86b7-6d1c5e1af116/keystone-api/0.log" Dec 02 21:27:23 crc kubenswrapper[4807]: I1202 21:27:23.622201 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf_a22ccfdd-695f-49fe-9bd9-5f1109915c63/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:23 crc kubenswrapper[4807]: I1202 21:27:23.999389 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x_f2cd5e17-9097-447f-8fcf-7e95a2621845/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:24 crc kubenswrapper[4807]: I1202 21:27:24.021209 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5597979745-dn972_a98ea655-0ec2-4d0f-951a-57f5ee9f6df2/neutron-httpd/0.log" Dec 02 21:27:24 crc kubenswrapper[4807]: I1202 21:27:24.129776 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5597979745-dn972_a98ea655-0ec2-4d0f-951a-57f5ee9f6df2/neutron-api/0.log" Dec 02 21:27:24 crc kubenswrapper[4807]: I1202 21:27:24.701081 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a/nova-cell0-conductor-conductor/0.log" Dec 02 21:27:25 crc kubenswrapper[4807]: I1202 21:27:25.043969 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c5860289-2a92-47f1-855c-399a8c590f7f/nova-cell1-conductor-conductor/0.log" Dec 02 21:27:25 crc kubenswrapper[4807]: I1202 21:27:25.341161 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_fc1416d8-8665-48f9-ad43-b2e16b6a5ecb/nova-cell1-novncproxy-novncproxy/0.log" Dec 02 21:27:25 crc kubenswrapper[4807]: I1202 21:27:25.351795 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9caa6001-4e75-4042-998f-9f00f49ef173/nova-api-log/0.log" Dec 02 21:27:25 crc kubenswrapper[4807]: I1202 21:27:25.622846 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_9caa6001-4e75-4042-998f-9f00f49ef173/nova-api-api/0.log" Dec 02 21:27:26 crc kubenswrapper[4807]: I1202 21:27:26.029190 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_96fcc760-d492-4e5b-8d31-de6c7f49b47f/nova-metadata-log/0.log" Dec 02 21:27:26 crc kubenswrapper[4807]: I1202 21:27:26.080558 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-48b5r_c7cb6b66-35b2-477f-8d6a-3037a6931797/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:26 crc kubenswrapper[4807]: I1202 21:27:26.568349 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_96ba9206-497c-4cd1-a16d-436d2ba285a7/mysql-bootstrap/0.log" Dec 02 21:27:26 crc kubenswrapper[4807]: I1202 21:27:26.648780 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_3db3f424-7a28-419f-b5e1-0dec9279d417/nova-scheduler-scheduler/0.log" Dec 02 21:27:26 crc kubenswrapper[4807]: I1202 21:27:26.807946 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_96ba9206-497c-4cd1-a16d-436d2ba285a7/galera/0.log" Dec 02 21:27:26 crc kubenswrapper[4807]: I1202 21:27:26.846102 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_96ba9206-497c-4cd1-a16d-436d2ba285a7/mysql-bootstrap/0.log" Dec 02 21:27:27 crc kubenswrapper[4807]: I1202 21:27:27.540070 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_df8d83f3-6675-416b-a039-2aafac45fe18/mysql-bootstrap/0.log" Dec 02 21:27:27 crc kubenswrapper[4807]: I1202 21:27:27.775986 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_df8d83f3-6675-416b-a039-2aafac45fe18/mysql-bootstrap/0.log" Dec 02 21:27:27 crc kubenswrapper[4807]: I1202 21:27:27.803891 4807 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_openstack-galera-0_df8d83f3-6675-416b-a039-2aafac45fe18/galera/0.log" Dec 02 21:27:28 crc kubenswrapper[4807]: I1202 21:27:28.031105 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0f6c2f22-8527-4428-a503-7aedd5635e6b/openstackclient/0.log" Dec 02 21:27:28 crc kubenswrapper[4807]: I1202 21:27:28.137236 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_96fcc760-d492-4e5b-8d31-de6c7f49b47f/nova-metadata-metadata/0.log" Dec 02 21:27:28 crc kubenswrapper[4807]: I1202 21:27:28.186424 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-kstxp_80844aa2-667c-4b2d-a55a-e5fa2cd3dd85/ovn-controller/0.log" Dec 02 21:27:28 crc kubenswrapper[4807]: I1202 21:27:28.623793 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pxxrz_f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd/ovsdb-server-init/0.log" Dec 02 21:27:28 crc kubenswrapper[4807]: I1202 21:27:28.658503 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-c2pcc_0ee2869c-161d-442b-81a3-b3790ab8cdfe/openstack-network-exporter/0.log" Dec 02 21:27:28 crc kubenswrapper[4807]: I1202 21:27:28.843879 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pxxrz_f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd/ovsdb-server-init/0.log" Dec 02 21:27:28 crc kubenswrapper[4807]: I1202 21:27:28.898196 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pxxrz_f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd/ovsdb-server/0.log" Dec 02 21:27:28 crc kubenswrapper[4807]: I1202 21:27:28.913885 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pxxrz_f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd/ovs-vswitchd/0.log" Dec 02 21:27:29 crc kubenswrapper[4807]: I1202 21:27:29.106870 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9q5fk_8b4b56e1-9070-4a42-beff-c3d9324e820c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:29 crc kubenswrapper[4807]: I1202 21:27:29.198461 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_06539675-3505-4f57-bdfd-54ccdb96d90a/openstack-network-exporter/0.log" Dec 02 21:27:29 crc kubenswrapper[4807]: I1202 21:27:29.252378 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_06539675-3505-4f57-bdfd-54ccdb96d90a/ovn-northd/0.log" Dec 02 21:27:29 crc kubenswrapper[4807]: I1202 21:27:29.425556 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a63890f-30c8-4538-903a-121488dba6bb/ovsdbserver-nb/0.log" Dec 02 21:27:29 crc kubenswrapper[4807]: I1202 21:27:29.496357 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a63890f-30c8-4538-903a-121488dba6bb/openstack-network-exporter/0.log" Dec 02 21:27:29 crc kubenswrapper[4807]: I1202 21:27:29.702874 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_47753529-368a-4c5c-a3f8-27ffd55e41d1/openstack-network-exporter/0.log" Dec 02 21:27:29 crc kubenswrapper[4807]: I1202 21:27:29.711411 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_47753529-368a-4c5c-a3f8-27ffd55e41d1/ovsdbserver-sb/0.log" Dec 02 21:27:29 crc kubenswrapper[4807]: I1202 21:27:29.929303 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-844d66f984-gvswh_cd577cf0-d4de-4a57-9254-8a7bf61aa686/placement-api/0.log" Dec 02 21:27:30 crc kubenswrapper[4807]: I1202 21:27:30.113533 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_35ee0b99-6360-4b6d-bc80-8b420b1054c0/init-config-reloader/0.log" Dec 02 21:27:30 crc kubenswrapper[4807]: I1202 21:27:30.169754 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-844d66f984-gvswh_cd577cf0-d4de-4a57-9254-8a7bf61aa686/placement-log/0.log" Dec 02 21:27:30 crc kubenswrapper[4807]: I1202 21:27:30.280600 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_35ee0b99-6360-4b6d-bc80-8b420b1054c0/config-reloader/0.log" Dec 02 21:27:30 crc kubenswrapper[4807]: I1202 21:27:30.290681 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_35ee0b99-6360-4b6d-bc80-8b420b1054c0/init-config-reloader/0.log" Dec 02 21:27:30 crc kubenswrapper[4807]: I1202 21:27:30.405109 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_35ee0b99-6360-4b6d-bc80-8b420b1054c0/prometheus/0.log" Dec 02 21:27:30 crc kubenswrapper[4807]: I1202 21:27:30.426465 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_35ee0b99-6360-4b6d-bc80-8b420b1054c0/thanos-sidecar/0.log" Dec 02 21:27:30 crc kubenswrapper[4807]: I1202 21:27:30.524381 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_748ead81-bff5-4a69-9398-4e3c91be5979/setup-container/0.log" Dec 02 21:27:30 crc kubenswrapper[4807]: I1202 21:27:30.689675 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_748ead81-bff5-4a69-9398-4e3c91be5979/setup-container/0.log" Dec 02 21:27:30 crc kubenswrapper[4807]: I1202 21:27:30.793780 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_748ead81-bff5-4a69-9398-4e3c91be5979/rabbitmq/0.log" Dec 02 21:27:30 crc kubenswrapper[4807]: I1202 21:27:30.884496 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8593e062-85d8-4f22-88b4-eb7cf5654859/setup-container/0.log" Dec 02 21:27:31 crc kubenswrapper[4807]: I1202 21:27:31.115987 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8593e062-85d8-4f22-88b4-eb7cf5654859/setup-container/0.log" Dec 02 21:27:31 crc kubenswrapper[4807]: I1202 21:27:31.139937 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8593e062-85d8-4f22-88b4-eb7cf5654859/rabbitmq/0.log" Dec 02 21:27:31 crc kubenswrapper[4807]: I1202 21:27:31.257659 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6_89a829f1-32e4-4b5b-ba48-196916b1da6f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:31 crc kubenswrapper[4807]: I1202 21:27:31.690954 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-89mf9_2548dec0-ad57-411d-891a-0b847b25a4bb/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:31 crc kubenswrapper[4807]: I1202 21:27:31.718188 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk_8483568b-e11d-4aea-8fcb-1925d2e64fa2/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:31 crc kubenswrapper[4807]: I1202 21:27:31.902561 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pzgnd_94998bd3-5f5b-47cd-b14c-39e55cb78eaa/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:31 crc kubenswrapper[4807]: I1202 21:27:31.961598 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-8d2sc_d0bb0a20-106f-412e-8ba3-b218bacdadf5/ssh-known-hosts-edpm-deployment/0.log" Dec 02 21:27:32 crc kubenswrapper[4807]: I1202 21:27:32.224184 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9c96f4455-bvlsr_60cf7565-bc2c-469d-a0ad-400e95d69528/proxy-server/0.log" Dec 02 21:27:32 crc kubenswrapper[4807]: I1202 21:27:32.344137 4807 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9c96f4455-bvlsr_60cf7565-bc2c-469d-a0ad-400e95d69528/proxy-httpd/0.log" Dec 02 21:27:32 crc kubenswrapper[4807]: I1202 21:27:32.347451 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-msbf5_a3dd1529-ab40-486b-8458-e3d1afc9a0e2/swift-ring-rebalance/0.log" Dec 02 21:27:32 crc kubenswrapper[4807]: I1202 21:27:32.572202 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/account-replicator/0.log" Dec 02 21:27:32 crc kubenswrapper[4807]: I1202 21:27:32.591714 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/account-auditor/0.log" Dec 02 21:27:32 crc kubenswrapper[4807]: I1202 21:27:32.654464 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/account-reaper/0.log" Dec 02 21:27:32 crc kubenswrapper[4807]: I1202 21:27:32.799528 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/account-server/0.log" Dec 02 21:27:32 crc kubenswrapper[4807]: I1202 21:27:32.872349 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/container-auditor/0.log" Dec 02 21:27:32 crc kubenswrapper[4807]: I1202 21:27:32.914555 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/container-server/0.log" Dec 02 21:27:32 crc kubenswrapper[4807]: I1202 21:27:32.952413 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/container-replicator/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.055972 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/container-updater/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.117233 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/object-auditor/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.154131 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/object-expirer/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.233873 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/object-replicator/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.282760 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/object-server/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.307258 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/object-updater/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.377555 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/rsync/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.493347 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4225655b-c174-479e-a740-b768c9801287/memcached/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.500605 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/swift-recon-cron/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.626928 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-s86vg_c5be8c89-b466-4c89-aecd-548b6d5d19ae/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.721488 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e6826607-5100-439e-b82d-224b312a6faa/tempest-tests-tempest-tests-runner/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.760752 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb/test-operator-logs-container/0.log" Dec 02 21:27:33 crc kubenswrapper[4807]: I1202 21:27:33.909796 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj_f741a266-b127-46cc-8304-9aedd57f07b5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:27:34 crc kubenswrapper[4807]: I1202 21:27:34.505992 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_1d68c545-0435-4f66-a351-3ccba6fa68a3/watcher-applier/0.log" Dec 02 21:27:34 crc kubenswrapper[4807]: I1202 21:27:34.806535 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_bc49afa3-486b-481f-bb06-5b9bb2701021/watcher-api-log/0.log" Dec 02 21:27:35 crc kubenswrapper[4807]: I1202 21:27:35.264413 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_05674c10-e8c2-4ab8-9d80-185c9b814c9c/watcher-decision-engine/0.log" Dec 02 21:27:36 crc kubenswrapper[4807]: I1202 21:27:36.732285 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_bc49afa3-486b-481f-bb06-5b9bb2701021/watcher-api/0.log" Dec 02 21:28:04 crc kubenswrapper[4807]: I1202 21:28:04.209301 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/util/0.log" Dec 02 21:28:04 crc kubenswrapper[4807]: I1202 21:28:04.418761 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/pull/0.log" Dec 02 21:28:04 crc kubenswrapper[4807]: I1202 21:28:04.435569 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/pull/0.log" Dec 02 21:28:04 crc kubenswrapper[4807]: I1202 21:28:04.452003 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/util/0.log" Dec 02 21:28:04 crc kubenswrapper[4807]: I1202 21:28:04.626765 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/extract/0.log" Dec 02 21:28:04 crc kubenswrapper[4807]: I1202 21:28:04.628906 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/pull/0.log" Dec 02 21:28:04 crc kubenswrapper[4807]: I1202 21:28:04.633636 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/util/0.log" Dec 02 21:28:04 crc kubenswrapper[4807]: I1202 21:28:04.795646 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-r9jtf_7c007dd6-7efa-47c1-af56-ce0bf8fd6f37/kube-rbac-proxy/0.log" Dec 02 21:28:04 crc 
kubenswrapper[4807]: I1202 21:28:04.884935 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-r9jtf_7c007dd6-7efa-47c1-af56-ce0bf8fd6f37/manager/0.log" Dec 02 21:28:04 crc kubenswrapper[4807]: I1202 21:28:04.911477 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-g2lf7_3982be8e-b5d2-4795-9312-f3ba8466209c/kube-rbac-proxy/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.017914 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-g2lf7_3982be8e-b5d2-4795-9312-f3ba8466209c/manager/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.126123 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-p2b2h_357ab26f-5ac4-46a1-b8f3-89db969b4082/manager/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.127994 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-p2b2h_357ab26f-5ac4-46a1-b8f3-89db969b4082/kube-rbac-proxy/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.293518 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-mss4c_cfb9049e-2275-4d96-9131-29bb4def714b/kube-rbac-proxy/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.376319 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-mss4c_cfb9049e-2275-4d96-9131-29bb4def714b/manager/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.448212 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kq246_3c3d38aa-3600-41f6-97b3-e3699796526e/kube-rbac-proxy/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.508099 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kq246_3c3d38aa-3600-41f6-97b3-e3699796526e/manager/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.577711 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-6mvz2_f376fe60-0cdf-4b30-ab61-80178d738ea4/kube-rbac-proxy/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.664455 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-6mvz2_f376fe60-0cdf-4b30-ab61-80178d738ea4/manager/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.749640 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dn7v2_d7712aec-0995-489a-8cee-7e68fbf130df/kube-rbac-proxy/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.964214 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-tz65v_5ceaf50b-92b7-4069-b9b8-660e90c55d97/kube-rbac-proxy/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.977063 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-tz65v_5ceaf50b-92b7-4069-b9b8-660e90c55d97/manager/0.log" Dec 02 21:28:05 crc kubenswrapper[4807]: I1202 21:28:05.978422 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dn7v2_d7712aec-0995-489a-8cee-7e68fbf130df/manager/0.log" Dec 02 21:28:06 crc kubenswrapper[4807]: I1202 21:28:06.182054 4807 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gmwq8_9198e60c-4301-40b6-9d1b-3e91a2f10fa5/manager/0.log" Dec 02 21:28:06 crc kubenswrapper[4807]: I1202 21:28:06.185318 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gmwq8_9198e60c-4301-40b6-9d1b-3e91a2f10fa5/kube-rbac-proxy/0.log" Dec 02 21:28:06 crc kubenswrapper[4807]: I1202 21:28:06.281512 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-wzdmc_fa1a5516-5c0d-4d4e-b052-d9301371a2d3/kube-rbac-proxy/0.log" Dec 02 21:28:06 crc kubenswrapper[4807]: I1202 21:28:06.367385 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-52v8m_490b3442-f4b4-493d-824a-67e370ac26f9/kube-rbac-proxy/0.log" Dec 02 21:28:06 crc kubenswrapper[4807]: I1202 21:28:06.383682 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-wzdmc_fa1a5516-5c0d-4d4e-b052-d9301371a2d3/manager/0.log" Dec 02 21:28:06 crc kubenswrapper[4807]: I1202 21:28:06.501273 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-52v8m_490b3442-f4b4-493d-824a-67e370ac26f9/manager/0.log" Dec 02 21:28:06 crc kubenswrapper[4807]: I1202 21:28:06.592822 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-gd4bj_7cccb577-4849-4e1c-b38e-669f7658eb2e/kube-rbac-proxy/0.log" Dec 02 21:28:06 crc kubenswrapper[4807]: I1202 21:28:06.610667 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-gd4bj_7cccb577-4849-4e1c-b38e-669f7658eb2e/manager/0.log" Dec 02 21:28:06 crc 
kubenswrapper[4807]: I1202 21:28:06.785085 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-g8559_6511dc8d-00b4-4937-a330-0f5cf9c06fdd/kube-rbac-proxy/0.log" Dec 02 21:28:06 crc kubenswrapper[4807]: I1202 21:28:06.868131 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-g8559_6511dc8d-00b4-4937-a330-0f5cf9c06fdd/manager/0.log" Dec 02 21:28:06 crc kubenswrapper[4807]: I1202 21:28:06.936388 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9g547_1f4141f5-ba14-4c49-b114-07e5d506b255/kube-rbac-proxy/0.log" Dec 02 21:28:07 crc kubenswrapper[4807]: I1202 21:28:07.003649 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9g547_1f4141f5-ba14-4c49-b114-07e5d506b255/manager/0.log" Dec 02 21:28:07 crc kubenswrapper[4807]: I1202 21:28:07.041258 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk_e0e35837-6389-4e86-b8c5-46105f1332cb/kube-rbac-proxy/0.log" Dec 02 21:28:07 crc kubenswrapper[4807]: I1202 21:28:07.150989 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk_e0e35837-6389-4e86-b8c5-46105f1332cb/manager/0.log" Dec 02 21:28:07 crc kubenswrapper[4807]: I1202 21:28:07.509007 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-84f48485bd-tsr5l_9f824d2a-934d-4e25-95dd-6323a038f878/operator/0.log" Dec 02 21:28:07 crc kubenswrapper[4807]: I1202 21:28:07.559506 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-h8bl8_1f433a73-95ba-41cc-9f6e-3c6b26dd5e50/registry-server/0.log" Dec 02 21:28:07 crc kubenswrapper[4807]: I1202 21:28:07.841407 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-4qmqw_959355af-f6bd-492c-af58-9a7378224225/kube-rbac-proxy/0.log" Dec 02 21:28:07 crc kubenswrapper[4807]: I1202 21:28:07.869676 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-4qmqw_959355af-f6bd-492c-af58-9a7378224225/manager/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.017362 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-nh4hg_b2ef8498-337b-40c6-b122-19863c876321/kube-rbac-proxy/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.094682 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-nh4hg_b2ef8498-337b-40c6-b122-19863c876321/manager/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.242955 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-t8rj2_3035086c-2661-4720-97b1-df4d0cd891a6/operator/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.336233 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-7pqn2_b53c3dbf-2380-4dff-9b18-a2207efcce60/kube-rbac-proxy/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.409688 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-7pqn2_b53c3dbf-2380-4dff-9b18-a2207efcce60/manager/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.544454 4807 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-57fb6dd487-lffvh_4039c119-ee84-4043-8892-733499aabdc5/manager/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.591378 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-tzhtf_8dd00692-0728-47cf-b8fa-ab812b11ec8f/kube-rbac-proxy/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.710346 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-qrthg_733f7038-d2b9-4047-8ee9-3ad49a55729d/kube-rbac-proxy/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.784411 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-qrthg_733f7038-d2b9-4047-8ee9-3ad49a55729d/manager/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.811332 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-tzhtf_8dd00692-0728-47cf-b8fa-ab812b11ec8f/manager/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.920613 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-58888ff59d-cwws5_2a376983-2f33-465c-9781-391b67941e21/kube-rbac-proxy/0.log" Dec 02 21:28:08 crc kubenswrapper[4807]: I1202 21:28:08.974225 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-58888ff59d-cwws5_2a376983-2f33-465c-9781-391b67941e21/manager/0.log" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.049048 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g9wts"] Dec 02 21:28:22 crc kubenswrapper[4807]: E1202 21:28:22.051082 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c587dbf2-002d-4350-8165-c0555bc2edf9" containerName="container-00" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.051175 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c587dbf2-002d-4350-8165-c0555bc2edf9" containerName="container-00" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.051486 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c587dbf2-002d-4350-8165-c0555bc2edf9" containerName="container-00" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.053015 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.088248 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9wts"] Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.175727 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-utilities\") pod \"redhat-operators-g9wts\" (UID: \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.175809 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-catalog-content\") pod \"redhat-operators-g9wts\" (UID: \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.175966 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp84j\" (UniqueName: \"kubernetes.io/projected/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-kube-api-access-qp84j\") pod \"redhat-operators-g9wts\" (UID: 
\"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.278468 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-catalog-content\") pod \"redhat-operators-g9wts\" (UID: \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.278533 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp84j\" (UniqueName: \"kubernetes.io/projected/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-kube-api-access-qp84j\") pod \"redhat-operators-g9wts\" (UID: \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.278680 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-utilities\") pod \"redhat-operators-g9wts\" (UID: \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.279332 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-utilities\") pod \"redhat-operators-g9wts\" (UID: \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.279498 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-catalog-content\") pod \"redhat-operators-g9wts\" (UID: \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " 
pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.304518 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp84j\" (UniqueName: \"kubernetes.io/projected/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-kube-api-access-qp84j\") pod \"redhat-operators-g9wts\" (UID: \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:22 crc kubenswrapper[4807]: I1202 21:28:22.374183 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:23 crc kubenswrapper[4807]: I1202 21:28:23.595016 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g9wts"] Dec 02 21:28:23 crc kubenswrapper[4807]: I1202 21:28:23.896153 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9wts" event={"ID":"b74e4e4c-baac-4869-b4dd-c4589e30d0e5","Type":"ContainerStarted","Data":"45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b"} Dec 02 21:28:23 crc kubenswrapper[4807]: I1202 21:28:23.896573 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9wts" event={"ID":"b74e4e4c-baac-4869-b4dd-c4589e30d0e5","Type":"ContainerStarted","Data":"03113a6013d7ce3bddde64cdf2c83f513b1fd6ea60a0008b1ebb20dbb7fb63d7"} Dec 02 21:28:24 crc kubenswrapper[4807]: I1202 21:28:24.907236 4807 generic.go:334] "Generic (PLEG): container finished" podID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerID="45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b" exitCode=0 Dec 02 21:28:24 crc kubenswrapper[4807]: I1202 21:28:24.907297 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9wts" 
event={"ID":"b74e4e4c-baac-4869-b4dd-c4589e30d0e5","Type":"ContainerDied","Data":"45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b"} Dec 02 21:28:24 crc kubenswrapper[4807]: I1202 21:28:24.910841 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 21:28:26 crc kubenswrapper[4807]: I1202 21:28:26.936124 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9wts" event={"ID":"b74e4e4c-baac-4869-b4dd-c4589e30d0e5","Type":"ContainerStarted","Data":"eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8"} Dec 02 21:28:27 crc kubenswrapper[4807]: I1202 21:28:27.947150 4807 generic.go:334] "Generic (PLEG): container finished" podID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerID="eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8" exitCode=0 Dec 02 21:28:27 crc kubenswrapper[4807]: I1202 21:28:27.947196 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9wts" event={"ID":"b74e4e4c-baac-4869-b4dd-c4589e30d0e5","Type":"ContainerDied","Data":"eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8"} Dec 02 21:28:29 crc kubenswrapper[4807]: I1202 21:28:29.973343 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9wts" event={"ID":"b74e4e4c-baac-4869-b4dd-c4589e30d0e5","Type":"ContainerStarted","Data":"45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514"} Dec 02 21:28:32 crc kubenswrapper[4807]: I1202 21:28:32.067879 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-n7sh5_617adf8b-8ca9-4578-8c24-8f6b22713567/control-plane-machine-set-operator/0.log" Dec 02 21:28:32 crc kubenswrapper[4807]: I1202 21:28:32.294216 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7c8mj_3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8/kube-rbac-proxy/0.log" Dec 02 21:28:32 crc kubenswrapper[4807]: I1202 21:28:32.327264 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7c8mj_3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8/machine-api-operator/0.log" Dec 02 21:28:32 crc kubenswrapper[4807]: I1202 21:28:32.375182 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:32 crc kubenswrapper[4807]: I1202 21:28:32.375421 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:33 crc kubenswrapper[4807]: I1202 21:28:33.434208 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g9wts" podUID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerName="registry-server" probeResult="failure" output=< Dec 02 21:28:33 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 21:28:33 crc kubenswrapper[4807]: > Dec 02 21:28:42 crc kubenswrapper[4807]: I1202 21:28:42.435518 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:42 crc kubenswrapper[4807]: I1202 21:28:42.468395 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g9wts" podStartSLOduration=16.461293407 podStartE2EDuration="20.468372968s" podCreationTimestamp="2025-12-02 21:28:22 +0000 UTC" firstStartedPulling="2025-12-02 21:28:24.910561234 +0000 UTC m=+5440.211468729" lastFinishedPulling="2025-12-02 21:28:28.917640795 +0000 UTC m=+5444.218548290" observedRunningTime="2025-12-02 21:28:29.993510408 +0000 UTC m=+5445.294417923" watchObservedRunningTime="2025-12-02 21:28:42.468372968 +0000 UTC 
m=+5457.769280463" Dec 02 21:28:42 crc kubenswrapper[4807]: I1202 21:28:42.490995 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:42 crc kubenswrapper[4807]: I1202 21:28:42.689038 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9wts"] Dec 02 21:28:44 crc kubenswrapper[4807]: I1202 21:28:44.123707 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g9wts" podUID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerName="registry-server" containerID="cri-o://45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514" gracePeriod=2 Dec 02 21:28:44 crc kubenswrapper[4807]: I1202 21:28:44.603532 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:44 crc kubenswrapper[4807]: I1202 21:28:44.765428 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-catalog-content\") pod \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\" (UID: \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " Dec 02 21:28:44 crc kubenswrapper[4807]: I1202 21:28:44.765820 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-utilities\") pod \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\" (UID: \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " Dec 02 21:28:44 crc kubenswrapper[4807]: I1202 21:28:44.765940 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp84j\" (UniqueName: \"kubernetes.io/projected/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-kube-api-access-qp84j\") pod \"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\" (UID: 
\"b74e4e4c-baac-4869-b4dd-c4589e30d0e5\") " Dec 02 21:28:44 crc kubenswrapper[4807]: I1202 21:28:44.766983 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-utilities" (OuterVolumeSpecName: "utilities") pod "b74e4e4c-baac-4869-b4dd-c4589e30d0e5" (UID: "b74e4e4c-baac-4869-b4dd-c4589e30d0e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:28:44 crc kubenswrapper[4807]: I1202 21:28:44.775969 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-kube-api-access-qp84j" (OuterVolumeSpecName: "kube-api-access-qp84j") pod "b74e4e4c-baac-4869-b4dd-c4589e30d0e5" (UID: "b74e4e4c-baac-4869-b4dd-c4589e30d0e5"). InnerVolumeSpecName "kube-api-access-qp84j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:28:44 crc kubenswrapper[4807]: I1202 21:28:44.868527 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 21:28:44 crc kubenswrapper[4807]: I1202 21:28:44.868594 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp84j\" (UniqueName: \"kubernetes.io/projected/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-kube-api-access-qp84j\") on node \"crc\" DevicePath \"\"" Dec 02 21:28:44 crc kubenswrapper[4807]: I1202 21:28:44.891265 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b74e4e4c-baac-4869-b4dd-c4589e30d0e5" (UID: "b74e4e4c-baac-4869-b4dd-c4589e30d0e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:28:44 crc kubenswrapper[4807]: I1202 21:28:44.970857 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74e4e4c-baac-4869-b4dd-c4589e30d0e5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 21:28:45 crc kubenswrapper[4807]: I1202 21:28:45.144460 4807 generic.go:334] "Generic (PLEG): container finished" podID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerID="45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514" exitCode=0 Dec 02 21:28:45 crc kubenswrapper[4807]: I1202 21:28:45.144518 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g9wts" Dec 02 21:28:45 crc kubenswrapper[4807]: I1202 21:28:45.144531 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9wts" event={"ID":"b74e4e4c-baac-4869-b4dd-c4589e30d0e5","Type":"ContainerDied","Data":"45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514"} Dec 02 21:28:45 crc kubenswrapper[4807]: I1202 21:28:45.144577 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g9wts" event={"ID":"b74e4e4c-baac-4869-b4dd-c4589e30d0e5","Type":"ContainerDied","Data":"03113a6013d7ce3bddde64cdf2c83f513b1fd6ea60a0008b1ebb20dbb7fb63d7"} Dec 02 21:28:45 crc kubenswrapper[4807]: I1202 21:28:45.144628 4807 scope.go:117] "RemoveContainer" containerID="45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514" Dec 02 21:28:45 crc kubenswrapper[4807]: I1202 21:28:45.172960 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g9wts"] Dec 02 21:28:45 crc kubenswrapper[4807]: I1202 21:28:45.181442 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g9wts"] Dec 02 21:28:45 crc kubenswrapper[4807]: I1202 21:28:45.184788 
4807 scope.go:117] "RemoveContainer" containerID="eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8" Dec 02 21:28:46 crc kubenswrapper[4807]: I1202 21:28:46.014055 4807 scope.go:117] "RemoveContainer" containerID="45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b" Dec 02 21:28:46 crc kubenswrapper[4807]: I1202 21:28:46.086185 4807 scope.go:117] "RemoveContainer" containerID="45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514" Dec 02 21:28:46 crc kubenswrapper[4807]: E1202 21:28:46.086679 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514\": container with ID starting with 45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514 not found: ID does not exist" containerID="45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514" Dec 02 21:28:46 crc kubenswrapper[4807]: I1202 21:28:46.086750 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514"} err="failed to get container status \"45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514\": rpc error: code = NotFound desc = could not find container \"45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514\": container with ID starting with 45d76af6acce6ef25d909246339757015a447e6902ae1c66749b7b2d5a68e514 not found: ID does not exist" Dec 02 21:28:46 crc kubenswrapper[4807]: I1202 21:28:46.086781 4807 scope.go:117] "RemoveContainer" containerID="eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8" Dec 02 21:28:46 crc kubenswrapper[4807]: E1202 21:28:46.087408 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8\": container with ID starting 
with eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8 not found: ID does not exist" containerID="eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8" Dec 02 21:28:46 crc kubenswrapper[4807]: I1202 21:28:46.087444 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8"} err="failed to get container status \"eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8\": rpc error: code = NotFound desc = could not find container \"eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8\": container with ID starting with eb60875338ce65224e4ef7e5da56c8219ba82877f04e661ed86832cbf101ecd8 not found: ID does not exist" Dec 02 21:28:46 crc kubenswrapper[4807]: I1202 21:28:46.087465 4807 scope.go:117] "RemoveContainer" containerID="45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b" Dec 02 21:28:46 crc kubenswrapper[4807]: E1202 21:28:46.087798 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b\": container with ID starting with 45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b not found: ID does not exist" containerID="45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b" Dec 02 21:28:46 crc kubenswrapper[4807]: I1202 21:28:46.087824 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b"} err="failed to get container status \"45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b\": rpc error: code = NotFound desc = could not find container \"45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b\": container with ID starting with 45741cf550521b2d67d175302e2776a63fe0845ca152431a07abd30855257a2b not found: ID does 
not exist" Dec 02 21:28:46 crc kubenswrapper[4807]: E1202 21:28:46.252578 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice/crio-03113a6013d7ce3bddde64cdf2c83f513b1fd6ea60a0008b1ebb20dbb7fb63d7\": RecentStats: unable to find data in memory cache]" Dec 02 21:28:46 crc kubenswrapper[4807]: I1202 21:28:46.984049 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" path="/var/lib/kubelet/pods/b74e4e4c-baac-4869-b4dd-c4589e30d0e5/volumes" Dec 02 21:28:49 crc kubenswrapper[4807]: I1202 21:28:49.625043 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-mx7pp_d47db1e8-80ed-44a6-9273-9d9fd2b05e33/cert-manager-controller/0.log" Dec 02 21:28:49 crc kubenswrapper[4807]: I1202 21:28:49.733992 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wkdtr_2d1fe4d2-cc39-4c0c-a5e4-60366c119f94/cert-manager-cainjector/0.log" Dec 02 21:28:49 crc kubenswrapper[4807]: I1202 21:28:49.830571 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-fcc6m_1e90575d-2771-427e-a759-824575491965/cert-manager-webhook/0.log" Dec 02 21:28:56 crc kubenswrapper[4807]: E1202 21:28:56.545127 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice/crio-03113a6013d7ce3bddde64cdf2c83f513b1fd6ea60a0008b1ebb20dbb7fb63d7\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice\": RecentStats: unable to find data in memory cache]" Dec 02 21:29:03 crc kubenswrapper[4807]: I1202 21:29:03.129421 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-skgcp_37bb14e1-531b-43cf-b232-c11257dcf690/nmstate-console-plugin/0.log" Dec 02 21:29:03 crc kubenswrapper[4807]: I1202 21:29:03.305549 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wd9cv_8b23693d-f1f9-4ae2-9558-44a4a25745bd/nmstate-handler/0.log" Dec 02 21:29:03 crc kubenswrapper[4807]: I1202 21:29:03.389891 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6vnbp_01df7c9d-768d-417f-a7ed-7865655d889d/nmstate-metrics/0.log" Dec 02 21:29:03 crc kubenswrapper[4807]: I1202 21:29:03.434278 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6vnbp_01df7c9d-768d-417f-a7ed-7865655d889d/kube-rbac-proxy/0.log" Dec 02 21:29:03 crc kubenswrapper[4807]: I1202 21:29:03.618225 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-tqh42_6f69b73b-75f4-4c02-a252-efd2ea50b022/nmstate-operator/0.log" Dec 02 21:29:03 crc kubenswrapper[4807]: I1202 21:29:03.651105 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-k8qmj_4a32ad98-5354-49e6-957e-ad0828445a24/nmstate-webhook/0.log" Dec 02 21:29:06 crc kubenswrapper[4807]: E1202 21:29:06.834999 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice/crio-03113a6013d7ce3bddde64cdf2c83f513b1fd6ea60a0008b1ebb20dbb7fb63d7\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice\": RecentStats: unable to find data in memory cache]" Dec 02 21:29:17 crc kubenswrapper[4807]: E1202 21:29:17.088781 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice/crio-03113a6013d7ce3bddde64cdf2c83f513b1fd6ea60a0008b1ebb20dbb7fb63d7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice\": RecentStats: unable to find data in memory cache]" Dec 02 21:29:19 crc kubenswrapper[4807]: I1202 21:29:19.553448 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-qwrnk_af18f5b9-8057-49c0-b0b0-d64a7fff5357/kube-rbac-proxy/0.log" Dec 02 21:29:19 crc kubenswrapper[4807]: I1202 21:29:19.637804 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-qwrnk_af18f5b9-8057-49c0-b0b0-d64a7fff5357/controller/0.log" Dec 02 21:29:19 crc kubenswrapper[4807]: I1202 21:29:19.766133 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-frr-files/0.log" Dec 02 21:29:19 crc kubenswrapper[4807]: I1202 21:29:19.909487 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-frr-files/0.log" Dec 02 21:29:19 crc kubenswrapper[4807]: I1202 21:29:19.934027 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-reloader/0.log" Dec 02 21:29:19 crc kubenswrapper[4807]: I1202 21:29:19.972649 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-reloader/0.log" Dec 02 21:29:19 crc kubenswrapper[4807]: I1202 21:29:19.979968 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-metrics/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.183905 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-metrics/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.189643 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-reloader/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.199225 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-frr-files/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.208257 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-metrics/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.387301 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-frr-files/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.396279 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-reloader/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.397781 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-metrics/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.412501 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/controller/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.572408 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/kube-rbac-proxy/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.578451 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/frr-metrics/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.643140 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/kube-rbac-proxy-frr/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.826765 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/reloader/0.log" Dec 02 21:29:20 crc kubenswrapper[4807]: I1202 21:29:20.909516 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-4hldc_13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f/frr-k8s-webhook-server/0.log" Dec 02 21:29:21 crc kubenswrapper[4807]: I1202 21:29:21.131644 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7684466ddc-qvkp7_a7267c32-ac50-4ee6-8766-f9e586c3bf39/manager/0.log" Dec 02 21:29:21 crc kubenswrapper[4807]: I1202 21:29:21.784656 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-9c8db4b6b-74mvv_40eff5fb-df4c-47e1-bddf-ec09d648f511/webhook-server/0.log" Dec 02 21:29:21 crc kubenswrapper[4807]: I1202 21:29:21.812591 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vgclf_10f57a6c-ce50-4026-a330-b0a195528a92/kube-rbac-proxy/0.log" Dec 02 21:29:22 crc kubenswrapper[4807]: I1202 21:29:22.382668 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/frr/0.log" Dec 02 21:29:22 crc kubenswrapper[4807]: I1202 21:29:22.433028 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vgclf_10f57a6c-ce50-4026-a330-b0a195528a92/speaker/0.log" Dec 02 21:29:27 crc kubenswrapper[4807]: E1202 21:29:27.342172 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice/crio-03113a6013d7ce3bddde64cdf2c83f513b1fd6ea60a0008b1ebb20dbb7fb63d7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice\": RecentStats: unable to find data in memory cache]" Dec 02 21:29:27 crc kubenswrapper[4807]: I1202 21:29:27.983886 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-22kkj"] Dec 02 21:29:27 crc kubenswrapper[4807]: E1202 21:29:27.985052 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerName="extract-utilities" Dec 02 21:29:27 crc kubenswrapper[4807]: I1202 21:29:27.985075 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerName="extract-utilities" Dec 02 21:29:27 crc kubenswrapper[4807]: E1202 21:29:27.985105 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerName="registry-server" Dec 02 21:29:27 crc kubenswrapper[4807]: I1202 21:29:27.985117 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerName="registry-server" Dec 02 21:29:27 crc kubenswrapper[4807]: E1202 21:29:27.985155 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerName="extract-content" Dec 02 21:29:27 crc kubenswrapper[4807]: I1202 21:29:27.985167 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerName="extract-content" Dec 02 21:29:27 crc kubenswrapper[4807]: I1202 21:29:27.985523 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b74e4e4c-baac-4869-b4dd-c4589e30d0e5" containerName="registry-server" Dec 02 21:29:27 crc kubenswrapper[4807]: I1202 21:29:27.988154 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.006461 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-22kkj"] Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.091343 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-catalog-content\") pod \"redhat-marketplace-22kkj\" (UID: \"0dede84f-09bb-4825-a4b5-5b131a77e751\") " pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.091600 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b85vg\" (UniqueName: \"kubernetes.io/projected/0dede84f-09bb-4825-a4b5-5b131a77e751-kube-api-access-b85vg\") pod \"redhat-marketplace-22kkj\" (UID: \"0dede84f-09bb-4825-a4b5-5b131a77e751\") " pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.091840 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-utilities\") pod \"redhat-marketplace-22kkj\" (UID: 
\"0dede84f-09bb-4825-a4b5-5b131a77e751\") " pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.194230 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b85vg\" (UniqueName: \"kubernetes.io/projected/0dede84f-09bb-4825-a4b5-5b131a77e751-kube-api-access-b85vg\") pod \"redhat-marketplace-22kkj\" (UID: \"0dede84f-09bb-4825-a4b5-5b131a77e751\") " pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.194523 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-utilities\") pod \"redhat-marketplace-22kkj\" (UID: \"0dede84f-09bb-4825-a4b5-5b131a77e751\") " pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.194654 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-catalog-content\") pod \"redhat-marketplace-22kkj\" (UID: \"0dede84f-09bb-4825-a4b5-5b131a77e751\") " pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.195085 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-utilities\") pod \"redhat-marketplace-22kkj\" (UID: \"0dede84f-09bb-4825-a4b5-5b131a77e751\") " pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.195098 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-catalog-content\") pod \"redhat-marketplace-22kkj\" (UID: \"0dede84f-09bb-4825-a4b5-5b131a77e751\") " 
pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.213282 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b85vg\" (UniqueName: \"kubernetes.io/projected/0dede84f-09bb-4825-a4b5-5b131a77e751-kube-api-access-b85vg\") pod \"redhat-marketplace-22kkj\" (UID: \"0dede84f-09bb-4825-a4b5-5b131a77e751\") " pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.292849 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.293175 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.320130 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:28 crc kubenswrapper[4807]: I1202 21:29:28.824807 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-22kkj"] Dec 02 21:29:29 crc kubenswrapper[4807]: I1202 21:29:29.576886 4807 generic.go:334] "Generic (PLEG): container finished" podID="0dede84f-09bb-4825-a4b5-5b131a77e751" containerID="180fcd723b2d22200a6bcf280553d3d12b0720e7fa21334c59b2a59093f52ca5" exitCode=0 Dec 02 21:29:29 crc kubenswrapper[4807]: I1202 21:29:29.576957 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22kkj" event={"ID":"0dede84f-09bb-4825-a4b5-5b131a77e751","Type":"ContainerDied","Data":"180fcd723b2d22200a6bcf280553d3d12b0720e7fa21334c59b2a59093f52ca5"} Dec 02 21:29:29 crc kubenswrapper[4807]: I1202 21:29:29.577336 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22kkj" event={"ID":"0dede84f-09bb-4825-a4b5-5b131a77e751","Type":"ContainerStarted","Data":"2d65df1e365a14efd27ca2ed60c211d4f8c72f8baf41f2567e85cb65e5465abf"} Dec 02 21:29:31 crc kubenswrapper[4807]: I1202 21:29:31.602744 4807 generic.go:334] "Generic (PLEG): container finished" podID="0dede84f-09bb-4825-a4b5-5b131a77e751" containerID="8ae19e1fe0daa3992472576f17ff054707e82355b7dab5e6b19d6a36672710b9" exitCode=0 Dec 02 21:29:31 crc kubenswrapper[4807]: I1202 21:29:31.602997 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22kkj" event={"ID":"0dede84f-09bb-4825-a4b5-5b131a77e751","Type":"ContainerDied","Data":"8ae19e1fe0daa3992472576f17ff054707e82355b7dab5e6b19d6a36672710b9"} Dec 02 21:29:33 crc kubenswrapper[4807]: I1202 21:29:33.629496 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22kkj" 
event={"ID":"0dede84f-09bb-4825-a4b5-5b131a77e751","Type":"ContainerStarted","Data":"b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2"} Dec 02 21:29:33 crc kubenswrapper[4807]: I1202 21:29:33.659548 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-22kkj" podStartSLOduration=4.106373848 podStartE2EDuration="6.659508475s" podCreationTimestamp="2025-12-02 21:29:27 +0000 UTC" firstStartedPulling="2025-12-02 21:29:29.579310731 +0000 UTC m=+5504.880218226" lastFinishedPulling="2025-12-02 21:29:32.132445318 +0000 UTC m=+5507.433352853" observedRunningTime="2025-12-02 21:29:33.655173932 +0000 UTC m=+5508.956081427" watchObservedRunningTime="2025-12-02 21:29:33.659508475 +0000 UTC m=+5508.960415970" Dec 02 21:29:37 crc kubenswrapper[4807]: I1202 21:29:37.146985 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/util/0.log" Dec 02 21:29:37 crc kubenswrapper[4807]: I1202 21:29:37.147081 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/util/0.log" Dec 02 21:29:37 crc kubenswrapper[4807]: I1202 21:29:37.213645 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/pull/0.log" Dec 02 21:29:37 crc kubenswrapper[4807]: I1202 21:29:37.348263 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/pull/0.log" Dec 02 21:29:37 crc kubenswrapper[4807]: I1202 21:29:37.520217 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/util/0.log" Dec 02 21:29:37 crc kubenswrapper[4807]: I1202 21:29:37.526450 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/extract/0.log" Dec 02 21:29:37 crc kubenswrapper[4807]: I1202 21:29:37.557167 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/pull/0.log" Dec 02 21:29:37 crc kubenswrapper[4807]: E1202 21:29:37.635167 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice/crio-03113a6013d7ce3bddde64cdf2c83f513b1fd6ea60a0008b1ebb20dbb7fb63d7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74e4e4c_baac_4869_b4dd_c4589e30d0e5.slice\": RecentStats: unable to find data in memory cache]" Dec 02 21:29:37 crc kubenswrapper[4807]: I1202 21:29:37.705773 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/util/0.log" Dec 02 21:29:37 crc kubenswrapper[4807]: I1202 21:29:37.849514 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/util/0.log" Dec 02 21:29:37 crc kubenswrapper[4807]: I1202 21:29:37.876658 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/pull/0.log" Dec 02 21:29:37 crc kubenswrapper[4807]: I1202 21:29:37.880436 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/pull/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.036963 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/pull/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.043909 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/util/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.056029 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/extract/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.210894 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/util/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.321016 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.321091 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.376260 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.392392 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/util/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.444122 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/pull/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.446200 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/pull/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.614796 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/util/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.615743 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/pull/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.658763 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/extract/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.742894 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.788719 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-22kkj"] Dec 02 
21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.825578 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-utilities/0.log" Dec 02 21:29:38 crc kubenswrapper[4807]: I1202 21:29:38.985976 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-content/0.log" Dec 02 21:29:39 crc kubenswrapper[4807]: I1202 21:29:39.008138 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-utilities/0.log" Dec 02 21:29:39 crc kubenswrapper[4807]: I1202 21:29:39.016671 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-content/0.log" Dec 02 21:29:39 crc kubenswrapper[4807]: I1202 21:29:39.223360 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-utilities/0.log" Dec 02 21:29:39 crc kubenswrapper[4807]: I1202 21:29:39.240458 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-content/0.log" Dec 02 21:29:39 crc kubenswrapper[4807]: I1202 21:29:39.417909 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-utilities/0.log" Dec 02 21:29:39 crc kubenswrapper[4807]: I1202 21:29:39.644616 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-utilities/0.log" Dec 02 21:29:39 crc kubenswrapper[4807]: I1202 21:29:39.653728 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-content/0.log" Dec 02 21:29:39 crc kubenswrapper[4807]: I1202 21:29:39.719447 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-content/0.log" Dec 02 21:29:39 crc kubenswrapper[4807]: I1202 21:29:39.855027 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-utilities/0.log" Dec 02 21:29:39 crc kubenswrapper[4807]: I1202 21:29:39.886480 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-content/0.log" Dec 02 21:29:39 crc kubenswrapper[4807]: I1202 21:29:39.980358 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/registry-server/0.log" Dec 02 21:29:40 crc kubenswrapper[4807]: I1202 21:29:40.145850 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xd6tg_ffb3e245-8658-45b6-b784-250cd6d34a93/marketplace-operator/0.log" Dec 02 21:29:40 crc kubenswrapper[4807]: I1202 21:29:40.327188 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22kkj_0dede84f-09bb-4825-a4b5-5b131a77e751/extract-utilities/0.log" Dec 02 21:29:40 crc kubenswrapper[4807]: I1202 21:29:40.651093 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22kkj_0dede84f-09bb-4825-a4b5-5b131a77e751/extract-content/0.log" Dec 02 21:29:40 crc kubenswrapper[4807]: I1202 21:29:40.662941 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-22kkj_0dede84f-09bb-4825-a4b5-5b131a77e751/extract-utilities/0.log" Dec 02 21:29:40 crc kubenswrapper[4807]: I1202 21:29:40.681069 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22kkj_0dede84f-09bb-4825-a4b5-5b131a77e751/extract-content/0.log" Dec 02 21:29:40 crc kubenswrapper[4807]: I1202 21:29:40.707240 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-22kkj" podUID="0dede84f-09bb-4825-a4b5-5b131a77e751" containerName="registry-server" containerID="cri-o://b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2" gracePeriod=2 Dec 02 21:29:40 crc kubenswrapper[4807]: I1202 21:29:40.815339 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/registry-server/0.log" Dec 02 21:29:40 crc kubenswrapper[4807]: I1202 21:29:40.998229 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22kkj_0dede84f-09bb-4825-a4b5-5b131a77e751/extract-utilities/0.log" Dec 02 21:29:40 crc kubenswrapper[4807]: I1202 21:29:40.999070 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22kkj_0dede84f-09bb-4825-a4b5-5b131a77e751/registry-server/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.082141 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-utilities/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.103663 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-22kkj_0dede84f-09bb-4825-a4b5-5b131a77e751/extract-content/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.217691 4807 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.237218 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-utilities/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.299352 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-content/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.312874 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-content/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.390743 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-utilities\") pod \"0dede84f-09bb-4825-a4b5-5b131a77e751\" (UID: \"0dede84f-09bb-4825-a4b5-5b131a77e751\") " Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.390820 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b85vg\" (UniqueName: \"kubernetes.io/projected/0dede84f-09bb-4825-a4b5-5b131a77e751-kube-api-access-b85vg\") pod \"0dede84f-09bb-4825-a4b5-5b131a77e751\" (UID: \"0dede84f-09bb-4825-a4b5-5b131a77e751\") " Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.390902 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-catalog-content\") pod \"0dede84f-09bb-4825-a4b5-5b131a77e751\" (UID: \"0dede84f-09bb-4825-a4b5-5b131a77e751\") " Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.391873 4807 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-utilities" (OuterVolumeSpecName: "utilities") pod "0dede84f-09bb-4825-a4b5-5b131a77e751" (UID: "0dede84f-09bb-4825-a4b5-5b131a77e751"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.393806 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.400550 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dede84f-09bb-4825-a4b5-5b131a77e751-kube-api-access-b85vg" (OuterVolumeSpecName: "kube-api-access-b85vg") pod "0dede84f-09bb-4825-a4b5-5b131a77e751" (UID: "0dede84f-09bb-4825-a4b5-5b131a77e751"). InnerVolumeSpecName "kube-api-access-b85vg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.414511 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dede84f-09bb-4825-a4b5-5b131a77e751" (UID: "0dede84f-09bb-4825-a4b5-5b131a77e751"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.461864 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-content/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.494633 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b85vg\" (UniqueName: \"kubernetes.io/projected/0dede84f-09bb-4825-a4b5-5b131a77e751-kube-api-access-b85vg\") on node \"crc\" DevicePath \"\"" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.494668 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dede84f-09bb-4825-a4b5-5b131a77e751-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.535361 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-utilities/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.561855 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-utilities/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.715019 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/registry-server/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.720993 4807 generic.go:334] "Generic (PLEG): container finished" podID="0dede84f-09bb-4825-a4b5-5b131a77e751" containerID="b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2" exitCode=0 Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.721041 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22kkj" 
event={"ID":"0dede84f-09bb-4825-a4b5-5b131a77e751","Type":"ContainerDied","Data":"b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2"} Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.721069 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-22kkj" event={"ID":"0dede84f-09bb-4825-a4b5-5b131a77e751","Type":"ContainerDied","Data":"2d65df1e365a14efd27ca2ed60c211d4f8c72f8baf41f2567e85cb65e5465abf"} Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.721088 4807 scope.go:117] "RemoveContainer" containerID="b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.721243 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-22kkj" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.750164 4807 scope.go:117] "RemoveContainer" containerID="8ae19e1fe0daa3992472576f17ff054707e82355b7dab5e6b19d6a36672710b9" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.755543 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-22kkj"] Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.768221 4807 scope.go:117] "RemoveContainer" containerID="180fcd723b2d22200a6bcf280553d3d12b0720e7fa21334c59b2a59093f52ca5" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.775405 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-22kkj"] Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.776497 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-utilities/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.800138 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-content/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.814884 4807 scope.go:117] "RemoveContainer" containerID="b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2" Dec 02 21:29:41 crc kubenswrapper[4807]: E1202 21:29:41.816786 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2\": container with ID starting with b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2 not found: ID does not exist" containerID="b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.816827 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2"} err="failed to get container status \"b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2\": rpc error: code = NotFound desc = could not find container \"b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2\": container with ID starting with b68f48401a471dc28880be63ac333b990c46ba2454c93c312f1c8da46ef57ae2 not found: ID does not exist" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.816857 4807 scope.go:117] "RemoveContainer" containerID="8ae19e1fe0daa3992472576f17ff054707e82355b7dab5e6b19d6a36672710b9" Dec 02 21:29:41 crc kubenswrapper[4807]: E1202 21:29:41.821602 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ae19e1fe0daa3992472576f17ff054707e82355b7dab5e6b19d6a36672710b9\": container with ID starting with 8ae19e1fe0daa3992472576f17ff054707e82355b7dab5e6b19d6a36672710b9 not found: ID does not exist" 
containerID="8ae19e1fe0daa3992472576f17ff054707e82355b7dab5e6b19d6a36672710b9" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.821648 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ae19e1fe0daa3992472576f17ff054707e82355b7dab5e6b19d6a36672710b9"} err="failed to get container status \"8ae19e1fe0daa3992472576f17ff054707e82355b7dab5e6b19d6a36672710b9\": rpc error: code = NotFound desc = could not find container \"8ae19e1fe0daa3992472576f17ff054707e82355b7dab5e6b19d6a36672710b9\": container with ID starting with 8ae19e1fe0daa3992472576f17ff054707e82355b7dab5e6b19d6a36672710b9 not found: ID does not exist" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.821671 4807 scope.go:117] "RemoveContainer" containerID="180fcd723b2d22200a6bcf280553d3d12b0720e7fa21334c59b2a59093f52ca5" Dec 02 21:29:41 crc kubenswrapper[4807]: E1202 21:29:41.823014 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"180fcd723b2d22200a6bcf280553d3d12b0720e7fa21334c59b2a59093f52ca5\": container with ID starting with 180fcd723b2d22200a6bcf280553d3d12b0720e7fa21334c59b2a59093f52ca5 not found: ID does not exist" containerID="180fcd723b2d22200a6bcf280553d3d12b0720e7fa21334c59b2a59093f52ca5" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.823080 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"180fcd723b2d22200a6bcf280553d3d12b0720e7fa21334c59b2a59093f52ca5"} err="failed to get container status \"180fcd723b2d22200a6bcf280553d3d12b0720e7fa21334c59b2a59093f52ca5\": rpc error: code = NotFound desc = could not find container \"180fcd723b2d22200a6bcf280553d3d12b0720e7fa21334c59b2a59093f52ca5\": container with ID starting with 180fcd723b2d22200a6bcf280553d3d12b0720e7fa21334c59b2a59093f52ca5 not found: ID does not exist" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.843420 4807 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-content/0.log" Dec 02 21:29:41 crc kubenswrapper[4807]: I1202 21:29:41.995447 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-content/0.log" Dec 02 21:29:42 crc kubenswrapper[4807]: I1202 21:29:42.007224 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-utilities/0.log" Dec 02 21:29:42 crc kubenswrapper[4807]: I1202 21:29:42.199686 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/registry-server/0.log" Dec 02 21:29:42 crc kubenswrapper[4807]: I1202 21:29:42.989078 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dede84f-09bb-4825-a4b5-5b131a77e751" path="/var/lib/kubelet/pods/0dede84f-09bb-4825-a4b5-5b131a77e751/volumes" Dec 02 21:29:57 crc kubenswrapper[4807]: I1202 21:29:57.673036 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-wv8s9_11697d44-d0d8-49be-ada1-7de7ab69950b/prometheus-operator/0.log" Dec 02 21:29:57 crc kubenswrapper[4807]: I1202 21:29:57.712099 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw_d45946cf-5cdf-4461-a0c0-c90b1367e919/prometheus-operator-admission-webhook/0.log" Dec 02 21:29:57 crc kubenswrapper[4807]: I1202 21:29:57.872071 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr_a50d34d7-348c-4533-8b5b-8c5f3ee88af3/prometheus-operator-admission-webhook/0.log" Dec 02 21:29:57 crc kubenswrapper[4807]: I1202 21:29:57.970433 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-m69t7_818c8714-3224-4307-92c3-efc98ece9f1d/operator/0.log" Dec 02 21:29:58 crc kubenswrapper[4807]: I1202 21:29:58.088680 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-glnxl_6d0398c9-073c-437e-a5cf-e8abec984ebe/perses-operator/0.log" Dec 02 21:29:58 crc kubenswrapper[4807]: I1202 21:29:58.293605 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:29:58 crc kubenswrapper[4807]: I1202 21:29:58.294008 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.165626 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt"] Dec 02 21:30:00 crc kubenswrapper[4807]: E1202 21:30:00.166372 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dede84f-09bb-4825-a4b5-5b131a77e751" containerName="extract-content" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.166388 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dede84f-09bb-4825-a4b5-5b131a77e751" containerName="extract-content" Dec 02 21:30:00 crc kubenswrapper[4807]: E1202 21:30:00.166421 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dede84f-09bb-4825-a4b5-5b131a77e751" containerName="registry-server" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 
21:30:00.166427 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dede84f-09bb-4825-a4b5-5b131a77e751" containerName="registry-server" Dec 02 21:30:00 crc kubenswrapper[4807]: E1202 21:30:00.166454 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dede84f-09bb-4825-a4b5-5b131a77e751" containerName="extract-utilities" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.166460 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dede84f-09bb-4825-a4b5-5b131a77e751" containerName="extract-utilities" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.166639 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dede84f-09bb-4825-a4b5-5b131a77e751" containerName="registry-server" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.167446 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.170185 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.170334 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.188074 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt"] Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.338892 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f218241-6a60-4ae7-a753-d7748c8a01c8-config-volume\") pod \"collect-profiles-29411850-wv7lt\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" 
Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.339541 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzz72\" (UniqueName: \"kubernetes.io/projected/5f218241-6a60-4ae7-a753-d7748c8a01c8-kube-api-access-lzz72\") pod \"collect-profiles-29411850-wv7lt\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.339931 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f218241-6a60-4ae7-a753-d7748c8a01c8-secret-volume\") pod \"collect-profiles-29411850-wv7lt\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.441815 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f218241-6a60-4ae7-a753-d7748c8a01c8-secret-volume\") pod \"collect-profiles-29411850-wv7lt\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.441919 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f218241-6a60-4ae7-a753-d7748c8a01c8-config-volume\") pod \"collect-profiles-29411850-wv7lt\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.442131 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzz72\" (UniqueName: \"kubernetes.io/projected/5f218241-6a60-4ae7-a753-d7748c8a01c8-kube-api-access-lzz72\") 
pod \"collect-profiles-29411850-wv7lt\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.442924 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f218241-6a60-4ae7-a753-d7748c8a01c8-config-volume\") pod \"collect-profiles-29411850-wv7lt\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.448780 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f218241-6a60-4ae7-a753-d7748c8a01c8-secret-volume\") pod \"collect-profiles-29411850-wv7lt\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.462741 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzz72\" (UniqueName: \"kubernetes.io/projected/5f218241-6a60-4ae7-a753-d7748c8a01c8-kube-api-access-lzz72\") pod \"collect-profiles-29411850-wv7lt\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:00 crc kubenswrapper[4807]: I1202 21:30:00.510163 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:01 crc kubenswrapper[4807]: I1202 21:30:01.074841 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt"] Dec 02 21:30:01 crc kubenswrapper[4807]: I1202 21:30:01.930177 4807 generic.go:334] "Generic (PLEG): container finished" podID="5f218241-6a60-4ae7-a753-d7748c8a01c8" containerID="4e197596830f04802202c557f4054127abedf88800319096235febb92f18f307" exitCode=0 Dec 02 21:30:01 crc kubenswrapper[4807]: I1202 21:30:01.930290 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" event={"ID":"5f218241-6a60-4ae7-a753-d7748c8a01c8","Type":"ContainerDied","Data":"4e197596830f04802202c557f4054127abedf88800319096235febb92f18f307"} Dec 02 21:30:01 crc kubenswrapper[4807]: I1202 21:30:01.930428 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" event={"ID":"5f218241-6a60-4ae7-a753-d7748c8a01c8","Type":"ContainerStarted","Data":"3b77f73cd6f5b62b5cfe12134700cb93cb73223adf525dc778f73522545c899f"} Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.305256 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.306981 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f218241-6a60-4ae7-a753-d7748c8a01c8-secret-volume\") pod \"5f218241-6a60-4ae7-a753-d7748c8a01c8\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.307080 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f218241-6a60-4ae7-a753-d7748c8a01c8-config-volume\") pod \"5f218241-6a60-4ae7-a753-d7748c8a01c8\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.307129 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzz72\" (UniqueName: \"kubernetes.io/projected/5f218241-6a60-4ae7-a753-d7748c8a01c8-kube-api-access-lzz72\") pod \"5f218241-6a60-4ae7-a753-d7748c8a01c8\" (UID: \"5f218241-6a60-4ae7-a753-d7748c8a01c8\") " Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.307981 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f218241-6a60-4ae7-a753-d7748c8a01c8-config-volume" (OuterVolumeSpecName: "config-volume") pod "5f218241-6a60-4ae7-a753-d7748c8a01c8" (UID: "5f218241-6a60-4ae7-a753-d7748c8a01c8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.308380 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f218241-6a60-4ae7-a753-d7748c8a01c8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.313648 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f218241-6a60-4ae7-a753-d7748c8a01c8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5f218241-6a60-4ae7-a753-d7748c8a01c8" (UID: "5f218241-6a60-4ae7-a753-d7748c8a01c8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.319683 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f218241-6a60-4ae7-a753-d7748c8a01c8-kube-api-access-lzz72" (OuterVolumeSpecName: "kube-api-access-lzz72") pod "5f218241-6a60-4ae7-a753-d7748c8a01c8" (UID: "5f218241-6a60-4ae7-a753-d7748c8a01c8"). InnerVolumeSpecName "kube-api-access-lzz72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.409211 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f218241-6a60-4ae7-a753-d7748c8a01c8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.409528 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzz72\" (UniqueName: \"kubernetes.io/projected/5f218241-6a60-4ae7-a753-d7748c8a01c8-kube-api-access-lzz72\") on node \"crc\" DevicePath \"\"" Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.952584 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" event={"ID":"5f218241-6a60-4ae7-a753-d7748c8a01c8","Type":"ContainerDied","Data":"3b77f73cd6f5b62b5cfe12134700cb93cb73223adf525dc778f73522545c899f"} Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.952629 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b77f73cd6f5b62b5cfe12134700cb93cb73223adf525dc778f73522545c899f" Dec 02 21:30:03 crc kubenswrapper[4807]: I1202 21:30:03.952652 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411850-wv7lt" Dec 02 21:30:04 crc kubenswrapper[4807]: I1202 21:30:04.403633 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8"] Dec 02 21:30:04 crc kubenswrapper[4807]: I1202 21:30:04.413519 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411805-gddq8"] Dec 02 21:30:04 crc kubenswrapper[4807]: I1202 21:30:04.991486 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d24dcd-2cef-49d6-8770-30f4051c4cf5" path="/var/lib/kubelet/pods/d7d24dcd-2cef-49d6-8770-30f4051c4cf5/volumes" Dec 02 21:30:13 crc kubenswrapper[4807]: I1202 21:30:13.727617 4807 scope.go:117] "RemoveContainer" containerID="8fa998dee75151db0096eb8857e1e77cde51172ec22cfd9c10953aba30b62159" Dec 02 21:30:28 crc kubenswrapper[4807]: I1202 21:30:28.292913 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:30:28 crc kubenswrapper[4807]: I1202 21:30:28.293492 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 21:30:28 crc kubenswrapper[4807]: I1202 21:30:28.293561 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 21:30:28 crc kubenswrapper[4807]: I1202 21:30:28.294630 4807 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea004cd87b9681bdd1d161f6f120895bdeed10d1a2a654222855ee3a5eb7c083"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 21:30:28 crc kubenswrapper[4807]: I1202 21:30:28.294743 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://ea004cd87b9681bdd1d161f6f120895bdeed10d1a2a654222855ee3a5eb7c083" gracePeriod=600 Dec 02 21:30:29 crc kubenswrapper[4807]: I1202 21:30:29.235853 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="ea004cd87b9681bdd1d161f6f120895bdeed10d1a2a654222855ee3a5eb7c083" exitCode=0 Dec 02 21:30:29 crc kubenswrapper[4807]: I1202 21:30:29.235930 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"ea004cd87b9681bdd1d161f6f120895bdeed10d1a2a654222855ee3a5eb7c083"} Dec 02 21:30:29 crc kubenswrapper[4807]: I1202 21:30:29.236340 4807 scope.go:117] "RemoveContainer" containerID="4b3f5f86379c59cd495e0ee61f0a345522d24ef04a1996a9e8aeaf0e347cc394" Dec 02 21:30:30 crc kubenswrapper[4807]: I1202 21:30:30.266339 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96"} Dec 02 21:31:48 crc kubenswrapper[4807]: I1202 21:31:48.249297 4807 generic.go:334] "Generic (PLEG): container finished" podID="c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" 
containerID="944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb" exitCode=0 Dec 02 21:31:48 crc kubenswrapper[4807]: I1202 21:31:48.249435 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7hxrb/must-gather-lx7rf" event={"ID":"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8","Type":"ContainerDied","Data":"944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb"} Dec 02 21:31:48 crc kubenswrapper[4807]: I1202 21:31:48.251097 4807 scope.go:117] "RemoveContainer" containerID="944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb" Dec 02 21:31:49 crc kubenswrapper[4807]: I1202 21:31:49.054493 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7hxrb_must-gather-lx7rf_c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8/gather/0.log" Dec 02 21:31:58 crc kubenswrapper[4807]: I1202 21:31:58.562787 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7hxrb/must-gather-lx7rf"] Dec 02 21:31:58 crc kubenswrapper[4807]: I1202 21:31:58.563515 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7hxrb/must-gather-lx7rf" podUID="c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" containerName="copy" containerID="cri-o://a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e" gracePeriod=2 Dec 02 21:31:58 crc kubenswrapper[4807]: I1202 21:31:58.571650 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7hxrb/must-gather-lx7rf"] Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.050229 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7hxrb_must-gather-lx7rf_c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8/copy/0.log" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.050981 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7hxrb/must-gather-lx7rf" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.216420 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpsgd\" (UniqueName: \"kubernetes.io/projected/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-kube-api-access-zpsgd\") pod \"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8\" (UID: \"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8\") " Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.216560 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-must-gather-output\") pod \"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8\" (UID: \"c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8\") " Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.223252 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-kube-api-access-zpsgd" (OuterVolumeSpecName: "kube-api-access-zpsgd") pod "c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" (UID: "c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8"). InnerVolumeSpecName "kube-api-access-zpsgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.319100 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpsgd\" (UniqueName: \"kubernetes.io/projected/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-kube-api-access-zpsgd\") on node \"crc\" DevicePath \"\"" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.360450 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7hxrb_must-gather-lx7rf_c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8/copy/0.log" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.361224 4807 generic.go:334] "Generic (PLEG): container finished" podID="c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" containerID="a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e" exitCode=143 Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.361287 4807 scope.go:117] "RemoveContainer" containerID="a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.361462 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7hxrb/must-gather-lx7rf" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.385471 4807 scope.go:117] "RemoveContainer" containerID="944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.405535 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" (UID: "c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.421297 4807 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.475755 4807 scope.go:117] "RemoveContainer" containerID="a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e" Dec 02 21:31:59 crc kubenswrapper[4807]: E1202 21:31:59.476185 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e\": container with ID starting with a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e not found: ID does not exist" containerID="a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.476228 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e"} err="failed to get container status \"a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e\": rpc error: code = NotFound desc = could not find container \"a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e\": container with ID starting with a0c400eac489521c1301d42c8c239c734fbce30a6cddf1ca737f24082266f21e not found: ID does not exist" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.476255 4807 scope.go:117] "RemoveContainer" containerID="944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb" Dec 02 21:31:59 crc kubenswrapper[4807]: E1202 21:31:59.476802 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb\": container with ID starting with 944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb not found: ID does not exist" containerID="944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb" Dec 02 21:31:59 crc kubenswrapper[4807]: I1202 21:31:59.476834 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb"} err="failed to get container status \"944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb\": rpc error: code = NotFound desc = could not find container \"944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb\": container with ID starting with 944da8437d8fce7c39b4f18c52c171de3bf2c11b0a908439d59cf47fdfd16acb not found: ID does not exist" Dec 02 21:32:00 crc kubenswrapper[4807]: I1202 21:32:00.987104 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" path="/var/lib/kubelet/pods/c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8/volumes" Dec 02 21:32:13 crc kubenswrapper[4807]: I1202 21:32:13.870249 4807 scope.go:117] "RemoveContainer" containerID="4fc04e70826148105bea0d04586d5023cae7f60628adc130cb839ab2ba04edfd" Dec 02 21:32:58 crc kubenswrapper[4807]: I1202 21:32:58.293607 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:32:58 crc kubenswrapper[4807]: I1202 21:32:58.294337 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 02 21:33:13 crc kubenswrapper[4807]: I1202 21:33:13.984384 4807 scope.go:117] "RemoveContainer" containerID="5eea71c0ed094b5bcfe6eacde1ca7c2459aa6beefa57ceec27c6f305a297e722" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.679682 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r8l29"] Dec 02 21:33:14 crc kubenswrapper[4807]: E1202 21:33:14.680608 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" containerName="gather" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.680633 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" containerName="gather" Dec 02 21:33:14 crc kubenswrapper[4807]: E1202 21:33:14.680655 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" containerName="copy" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.680663 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" containerName="copy" Dec 02 21:33:14 crc kubenswrapper[4807]: E1202 21:33:14.680684 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f218241-6a60-4ae7-a753-d7748c8a01c8" containerName="collect-profiles" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.680692 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f218241-6a60-4ae7-a753-d7748c8a01c8" containerName="collect-profiles" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.680983 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" containerName="gather" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.681012 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f218241-6a60-4ae7-a753-d7748c8a01c8" containerName="collect-profiles" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 
21:33:14.681028 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f0efe7-b4f9-4faa-9f28-8c2a294d18f8" containerName="copy" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.682880 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.699489 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8l29"] Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.842487 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-catalog-content\") pod \"community-operators-r8l29\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.842590 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5cq7\" (UniqueName: \"kubernetes.io/projected/51598d5f-5f50-4bb5-8708-822b9c003be3-kube-api-access-z5cq7\") pod \"community-operators-r8l29\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.842869 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-utilities\") pod \"community-operators-r8l29\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.944273 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-utilities\") pod \"community-operators-r8l29\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.944590 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-catalog-content\") pod \"community-operators-r8l29\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.944703 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5cq7\" (UniqueName: \"kubernetes.io/projected/51598d5f-5f50-4bb5-8708-822b9c003be3-kube-api-access-z5cq7\") pod \"community-operators-r8l29\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.945295 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-utilities\") pod \"community-operators-r8l29\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.945397 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-catalog-content\") pod \"community-operators-r8l29\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:14 crc kubenswrapper[4807]: I1202 21:33:14.972010 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5cq7\" (UniqueName: 
\"kubernetes.io/projected/51598d5f-5f50-4bb5-8708-822b9c003be3-kube-api-access-z5cq7\") pod \"community-operators-r8l29\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:15 crc kubenswrapper[4807]: I1202 21:33:15.073011 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:15 crc kubenswrapper[4807]: I1202 21:33:15.614299 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8l29"] Dec 02 21:33:16 crc kubenswrapper[4807]: I1202 21:33:16.558925 4807 generic.go:334] "Generic (PLEG): container finished" podID="51598d5f-5f50-4bb5-8708-822b9c003be3" containerID="fe347db62cd885ed3292d6575225fa58a78d194bba0d07754426d45c933e5fc0" exitCode=0 Dec 02 21:33:16 crc kubenswrapper[4807]: I1202 21:33:16.558974 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8l29" event={"ID":"51598d5f-5f50-4bb5-8708-822b9c003be3","Type":"ContainerDied","Data":"fe347db62cd885ed3292d6575225fa58a78d194bba0d07754426d45c933e5fc0"} Dec 02 21:33:16 crc kubenswrapper[4807]: I1202 21:33:16.559237 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8l29" event={"ID":"51598d5f-5f50-4bb5-8708-822b9c003be3","Type":"ContainerStarted","Data":"e665d452671f626424a0d00ff0da404b87bdb7f877d01d57fa6d779c0246ccc1"} Dec 02 21:33:18 crc kubenswrapper[4807]: I1202 21:33:18.581633 4807 generic.go:334] "Generic (PLEG): container finished" podID="51598d5f-5f50-4bb5-8708-822b9c003be3" containerID="9f10a06a7f6f3fd40ad35f9e591dc02cc18d3ae9e48632135b13bcb8a5f591d0" exitCode=0 Dec 02 21:33:18 crc kubenswrapper[4807]: I1202 21:33:18.581779 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8l29" 
event={"ID":"51598d5f-5f50-4bb5-8708-822b9c003be3","Type":"ContainerDied","Data":"9f10a06a7f6f3fd40ad35f9e591dc02cc18d3ae9e48632135b13bcb8a5f591d0"} Dec 02 21:33:19 crc kubenswrapper[4807]: I1202 21:33:19.595275 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8l29" event={"ID":"51598d5f-5f50-4bb5-8708-822b9c003be3","Type":"ContainerStarted","Data":"146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85"} Dec 02 21:33:19 crc kubenswrapper[4807]: I1202 21:33:19.634790 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r8l29" podStartSLOduration=3.2075170809999998 podStartE2EDuration="5.634765757s" podCreationTimestamp="2025-12-02 21:33:14 +0000 UTC" firstStartedPulling="2025-12-02 21:33:16.562035215 +0000 UTC m=+5731.862942720" lastFinishedPulling="2025-12-02 21:33:18.989283901 +0000 UTC m=+5734.290191396" observedRunningTime="2025-12-02 21:33:19.621329526 +0000 UTC m=+5734.922237041" watchObservedRunningTime="2025-12-02 21:33:19.634765757 +0000 UTC m=+5734.935673262" Dec 02 21:33:25 crc kubenswrapper[4807]: I1202 21:33:25.074819 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:25 crc kubenswrapper[4807]: I1202 21:33:25.075243 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:25 crc kubenswrapper[4807]: I1202 21:33:25.600733 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:25 crc kubenswrapper[4807]: I1202 21:33:25.708278 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:25 crc kubenswrapper[4807]: I1202 21:33:25.849560 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-r8l29"] Dec 02 21:33:27 crc kubenswrapper[4807]: I1202 21:33:27.680891 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r8l29" podUID="51598d5f-5f50-4bb5-8708-822b9c003be3" containerName="registry-server" containerID="cri-o://146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85" gracePeriod=2 Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.293467 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.293800 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.308473 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.447677 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5cq7\" (UniqueName: \"kubernetes.io/projected/51598d5f-5f50-4bb5-8708-822b9c003be3-kube-api-access-z5cq7\") pod \"51598d5f-5f50-4bb5-8708-822b9c003be3\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.448061 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-utilities\") pod \"51598d5f-5f50-4bb5-8708-822b9c003be3\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.448097 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-catalog-content\") pod \"51598d5f-5f50-4bb5-8708-822b9c003be3\" (UID: \"51598d5f-5f50-4bb5-8708-822b9c003be3\") " Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.449363 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-utilities" (OuterVolumeSpecName: "utilities") pod "51598d5f-5f50-4bb5-8708-822b9c003be3" (UID: "51598d5f-5f50-4bb5-8708-822b9c003be3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.453903 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51598d5f-5f50-4bb5-8708-822b9c003be3-kube-api-access-z5cq7" (OuterVolumeSpecName: "kube-api-access-z5cq7") pod "51598d5f-5f50-4bb5-8708-822b9c003be3" (UID: "51598d5f-5f50-4bb5-8708-822b9c003be3"). InnerVolumeSpecName "kube-api-access-z5cq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.551163 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5cq7\" (UniqueName: \"kubernetes.io/projected/51598d5f-5f50-4bb5-8708-822b9c003be3-kube-api-access-z5cq7\") on node \"crc\" DevicePath \"\"" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.551197 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.654134 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51598d5f-5f50-4bb5-8708-822b9c003be3" (UID: "51598d5f-5f50-4bb5-8708-822b9c003be3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.695120 4807 generic.go:334] "Generic (PLEG): container finished" podID="51598d5f-5f50-4bb5-8708-822b9c003be3" containerID="146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85" exitCode=0 Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.695185 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8l29" event={"ID":"51598d5f-5f50-4bb5-8708-822b9c003be3","Type":"ContainerDied","Data":"146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85"} Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.695218 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8l29" event={"ID":"51598d5f-5f50-4bb5-8708-822b9c003be3","Type":"ContainerDied","Data":"e665d452671f626424a0d00ff0da404b87bdb7f877d01d57fa6d779c0246ccc1"} Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 
21:33:28.695230 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8l29" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.695243 4807 scope.go:117] "RemoveContainer" containerID="146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.726044 4807 scope.go:117] "RemoveContainer" containerID="9f10a06a7f6f3fd40ad35f9e591dc02cc18d3ae9e48632135b13bcb8a5f591d0" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.753634 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8l29"] Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.759297 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51598d5f-5f50-4bb5-8708-822b9c003be3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.765043 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r8l29"] Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.781909 4807 scope.go:117] "RemoveContainer" containerID="fe347db62cd885ed3292d6575225fa58a78d194bba0d07754426d45c933e5fc0" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.813855 4807 scope.go:117] "RemoveContainer" containerID="146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85" Dec 02 21:33:28 crc kubenswrapper[4807]: E1202 21:33:28.814223 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85\": container with ID starting with 146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85 not found: ID does not exist" containerID="146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 
21:33:28.814255 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85"} err="failed to get container status \"146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85\": rpc error: code = NotFound desc = could not find container \"146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85\": container with ID starting with 146ae740942675c5e87e1f701a1f017274e029173d2cf3c35d07ce1be0387b85 not found: ID does not exist" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.814275 4807 scope.go:117] "RemoveContainer" containerID="9f10a06a7f6f3fd40ad35f9e591dc02cc18d3ae9e48632135b13bcb8a5f591d0" Dec 02 21:33:28 crc kubenswrapper[4807]: E1202 21:33:28.814765 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f10a06a7f6f3fd40ad35f9e591dc02cc18d3ae9e48632135b13bcb8a5f591d0\": container with ID starting with 9f10a06a7f6f3fd40ad35f9e591dc02cc18d3ae9e48632135b13bcb8a5f591d0 not found: ID does not exist" containerID="9f10a06a7f6f3fd40ad35f9e591dc02cc18d3ae9e48632135b13bcb8a5f591d0" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.814786 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f10a06a7f6f3fd40ad35f9e591dc02cc18d3ae9e48632135b13bcb8a5f591d0"} err="failed to get container status \"9f10a06a7f6f3fd40ad35f9e591dc02cc18d3ae9e48632135b13bcb8a5f591d0\": rpc error: code = NotFound desc = could not find container \"9f10a06a7f6f3fd40ad35f9e591dc02cc18d3ae9e48632135b13bcb8a5f591d0\": container with ID starting with 9f10a06a7f6f3fd40ad35f9e591dc02cc18d3ae9e48632135b13bcb8a5f591d0 not found: ID does not exist" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.814799 4807 scope.go:117] "RemoveContainer" containerID="fe347db62cd885ed3292d6575225fa58a78d194bba0d07754426d45c933e5fc0" Dec 02 21:33:28 crc 
kubenswrapper[4807]: E1202 21:33:28.815149 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe347db62cd885ed3292d6575225fa58a78d194bba0d07754426d45c933e5fc0\": container with ID starting with fe347db62cd885ed3292d6575225fa58a78d194bba0d07754426d45c933e5fc0 not found: ID does not exist" containerID="fe347db62cd885ed3292d6575225fa58a78d194bba0d07754426d45c933e5fc0" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.815182 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe347db62cd885ed3292d6575225fa58a78d194bba0d07754426d45c933e5fc0"} err="failed to get container status \"fe347db62cd885ed3292d6575225fa58a78d194bba0d07754426d45c933e5fc0\": rpc error: code = NotFound desc = could not find container \"fe347db62cd885ed3292d6575225fa58a78d194bba0d07754426d45c933e5fc0\": container with ID starting with fe347db62cd885ed3292d6575225fa58a78d194bba0d07754426d45c933e5fc0 not found: ID does not exist" Dec 02 21:33:28 crc kubenswrapper[4807]: I1202 21:33:28.984550 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51598d5f-5f50-4bb5-8708-822b9c003be3" path="/var/lib/kubelet/pods/51598d5f-5f50-4bb5-8708-822b9c003be3/volumes" Dec 02 21:33:58 crc kubenswrapper[4807]: I1202 21:33:58.292662 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:33:58 crc kubenswrapper[4807]: I1202 21:33:58.293170 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 02 21:33:58 crc kubenswrapper[4807]: I1202 21:33:58.293220 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 21:33:58 crc kubenswrapper[4807]: I1202 21:33:58.293984 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 21:33:58 crc kubenswrapper[4807]: I1202 21:33:58.294034 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" gracePeriod=600 Dec 02 21:33:58 crc kubenswrapper[4807]: E1202 21:33:58.426086 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:33:59 crc kubenswrapper[4807]: I1202 21:33:59.087815 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" exitCode=0 Dec 02 21:33:59 crc kubenswrapper[4807]: I1202 21:33:59.087876 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96"} Dec 02 21:33:59 crc kubenswrapper[4807]: I1202 21:33:59.088591 4807 scope.go:117] "RemoveContainer" containerID="ea004cd87b9681bdd1d161f6f120895bdeed10d1a2a654222855ee3a5eb7c083" Dec 02 21:33:59 crc kubenswrapper[4807]: I1202 21:33:59.089386 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:33:59 crc kubenswrapper[4807]: E1202 21:33:59.089859 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:34:10 crc kubenswrapper[4807]: I1202 21:34:10.972427 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:34:10 crc kubenswrapper[4807]: E1202 21:34:10.973516 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:34:24 crc kubenswrapper[4807]: I1202 21:34:24.992995 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:34:24 crc kubenswrapper[4807]: E1202 21:34:24.995146 4807 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:34:36 crc kubenswrapper[4807]: I1202 21:34:36.972879 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:34:36 crc kubenswrapper[4807]: E1202 21:34:36.974231 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:34:49 crc kubenswrapper[4807]: I1202 21:34:49.973087 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:34:49 crc kubenswrapper[4807]: E1202 21:34:49.974277 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:35:04 crc kubenswrapper[4807]: I1202 21:35:04.981420 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:35:04 crc kubenswrapper[4807]: E1202 
21:35:04.982560 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.102066 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r596x/must-gather-9cmtg"] Dec 02 21:35:17 crc kubenswrapper[4807]: E1202 21:35:17.103486 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51598d5f-5f50-4bb5-8708-822b9c003be3" containerName="registry-server" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.103501 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="51598d5f-5f50-4bb5-8708-822b9c003be3" containerName="registry-server" Dec 02 21:35:17 crc kubenswrapper[4807]: E1202 21:35:17.103514 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51598d5f-5f50-4bb5-8708-822b9c003be3" containerName="extract-utilities" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.103520 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="51598d5f-5f50-4bb5-8708-822b9c003be3" containerName="extract-utilities" Dec 02 21:35:17 crc kubenswrapper[4807]: E1202 21:35:17.103547 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51598d5f-5f50-4bb5-8708-822b9c003be3" containerName="extract-content" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.103554 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="51598d5f-5f50-4bb5-8708-822b9c003be3" containerName="extract-content" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.103843 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="51598d5f-5f50-4bb5-8708-822b9c003be3" 
containerName="registry-server" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.104959 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r596x/must-gather-9cmtg" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.111426 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r596x"/"kube-root-ca.crt" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.112038 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-r596x"/"default-dockercfg-khv2n" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.112262 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r596x"/"openshift-service-ca.crt" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.140328 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r596x/must-gather-9cmtg"] Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.209726 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7chbm\" (UniqueName: \"kubernetes.io/projected/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-kube-api-access-7chbm\") pod \"must-gather-9cmtg\" (UID: \"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165\") " pod="openshift-must-gather-r596x/must-gather-9cmtg" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.209794 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-must-gather-output\") pod \"must-gather-9cmtg\" (UID: \"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165\") " pod="openshift-must-gather-r596x/must-gather-9cmtg" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.312361 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7chbm\" (UniqueName: 
\"kubernetes.io/projected/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-kube-api-access-7chbm\") pod \"must-gather-9cmtg\" (UID: \"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165\") " pod="openshift-must-gather-r596x/must-gather-9cmtg" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.312431 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-must-gather-output\") pod \"must-gather-9cmtg\" (UID: \"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165\") " pod="openshift-must-gather-r596x/must-gather-9cmtg" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.312998 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-must-gather-output\") pod \"must-gather-9cmtg\" (UID: \"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165\") " pod="openshift-must-gather-r596x/must-gather-9cmtg" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.342839 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7chbm\" (UniqueName: \"kubernetes.io/projected/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-kube-api-access-7chbm\") pod \"must-gather-9cmtg\" (UID: \"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165\") " pod="openshift-must-gather-r596x/must-gather-9cmtg" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.432327 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r596x/must-gather-9cmtg" Dec 02 21:35:17 crc kubenswrapper[4807]: I1202 21:35:17.982821 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r596x/must-gather-9cmtg"] Dec 02 21:35:18 crc kubenswrapper[4807]: I1202 21:35:18.440772 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r596x/must-gather-9cmtg" event={"ID":"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165","Type":"ContainerStarted","Data":"e1ddf88e5a8a1cb1223500bd22fdd9599f635bd675451afebb109a9067548a25"} Dec 02 21:35:18 crc kubenswrapper[4807]: I1202 21:35:18.441196 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r596x/must-gather-9cmtg" event={"ID":"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165","Type":"ContainerStarted","Data":"60510b2e55ed4aaa43c4ff1fcefe0fdb1923aab05c688afb8d1c6b67865c986e"} Dec 02 21:35:18 crc kubenswrapper[4807]: I1202 21:35:18.973130 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:35:18 crc kubenswrapper[4807]: E1202 21:35:18.973866 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:35:19 crc kubenswrapper[4807]: I1202 21:35:19.454664 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r596x/must-gather-9cmtg" event={"ID":"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165","Type":"ContainerStarted","Data":"15127fbaf633f9398b6486d4a1d3d48d56b1cccc96e38e02c882c2189c529ada"} Dec 02 21:35:19 crc kubenswrapper[4807]: I1202 21:35:19.480496 4807 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-r596x/must-gather-9cmtg" podStartSLOduration=2.480471831 podStartE2EDuration="2.480471831s" podCreationTimestamp="2025-12-02 21:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 21:35:19.472887196 +0000 UTC m=+5854.773794691" watchObservedRunningTime="2025-12-02 21:35:19.480471831 +0000 UTC m=+5854.781379326" Dec 02 21:35:22 crc kubenswrapper[4807]: I1202 21:35:22.273323 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r596x/crc-debug-mzkf2"] Dec 02 21:35:22 crc kubenswrapper[4807]: I1202 21:35:22.275045 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-mzkf2" Dec 02 21:35:22 crc kubenswrapper[4807]: I1202 21:35:22.336381 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05e87d0f-5df3-41f4-acfa-205f16a9e631-host\") pod \"crc-debug-mzkf2\" (UID: \"05e87d0f-5df3-41f4-acfa-205f16a9e631\") " pod="openshift-must-gather-r596x/crc-debug-mzkf2" Dec 02 21:35:22 crc kubenswrapper[4807]: I1202 21:35:22.336448 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvh29\" (UniqueName: \"kubernetes.io/projected/05e87d0f-5df3-41f4-acfa-205f16a9e631-kube-api-access-dvh29\") pod \"crc-debug-mzkf2\" (UID: \"05e87d0f-5df3-41f4-acfa-205f16a9e631\") " pod="openshift-must-gather-r596x/crc-debug-mzkf2" Dec 02 21:35:22 crc kubenswrapper[4807]: I1202 21:35:22.438584 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05e87d0f-5df3-41f4-acfa-205f16a9e631-host\") pod \"crc-debug-mzkf2\" (UID: \"05e87d0f-5df3-41f4-acfa-205f16a9e631\") " pod="openshift-must-gather-r596x/crc-debug-mzkf2" Dec 02 21:35:22 crc 
kubenswrapper[4807]: I1202 21:35:22.438737 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvh29\" (UniqueName: \"kubernetes.io/projected/05e87d0f-5df3-41f4-acfa-205f16a9e631-kube-api-access-dvh29\") pod \"crc-debug-mzkf2\" (UID: \"05e87d0f-5df3-41f4-acfa-205f16a9e631\") " pod="openshift-must-gather-r596x/crc-debug-mzkf2" Dec 02 21:35:22 crc kubenswrapper[4807]: I1202 21:35:22.438739 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05e87d0f-5df3-41f4-acfa-205f16a9e631-host\") pod \"crc-debug-mzkf2\" (UID: \"05e87d0f-5df3-41f4-acfa-205f16a9e631\") " pod="openshift-must-gather-r596x/crc-debug-mzkf2" Dec 02 21:35:22 crc kubenswrapper[4807]: I1202 21:35:22.463479 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvh29\" (UniqueName: \"kubernetes.io/projected/05e87d0f-5df3-41f4-acfa-205f16a9e631-kube-api-access-dvh29\") pod \"crc-debug-mzkf2\" (UID: \"05e87d0f-5df3-41f4-acfa-205f16a9e631\") " pod="openshift-must-gather-r596x/crc-debug-mzkf2" Dec 02 21:35:22 crc kubenswrapper[4807]: I1202 21:35:22.593454 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-mzkf2" Dec 02 21:35:22 crc kubenswrapper[4807]: W1202 21:35:22.628456 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05e87d0f_5df3_41f4_acfa_205f16a9e631.slice/crio-74da64f3ab3ce82c484f86ca30dc28951f58d6cc90082f610a7cba79cb0472f9 WatchSource:0}: Error finding container 74da64f3ab3ce82c484f86ca30dc28951f58d6cc90082f610a7cba79cb0472f9: Status 404 returned error can't find the container with id 74da64f3ab3ce82c484f86ca30dc28951f58d6cc90082f610a7cba79cb0472f9 Dec 02 21:35:23 crc kubenswrapper[4807]: I1202 21:35:23.502784 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r596x/crc-debug-mzkf2" event={"ID":"05e87d0f-5df3-41f4-acfa-205f16a9e631","Type":"ContainerStarted","Data":"ae76063a4f826518852c367341478e5f52996a19842b1b905fec60f81ec2c73d"} Dec 02 21:35:23 crc kubenswrapper[4807]: I1202 21:35:23.503373 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r596x/crc-debug-mzkf2" event={"ID":"05e87d0f-5df3-41f4-acfa-205f16a9e631","Type":"ContainerStarted","Data":"74da64f3ab3ce82c484f86ca30dc28951f58d6cc90082f610a7cba79cb0472f9"} Dec 02 21:35:23 crc kubenswrapper[4807]: I1202 21:35:23.537968 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r596x/crc-debug-mzkf2" podStartSLOduration=1.537951002 podStartE2EDuration="1.537951002s" podCreationTimestamp="2025-12-02 21:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 21:35:23.527575207 +0000 UTC m=+5858.828482692" watchObservedRunningTime="2025-12-02 21:35:23.537951002 +0000 UTC m=+5858.838858497" Dec 02 21:35:32 crc kubenswrapper[4807]: I1202 21:35:32.971940 4807 scope.go:117] "RemoveContainer" 
containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:35:32 crc kubenswrapper[4807]: E1202 21:35:32.972567 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:35:44 crc kubenswrapper[4807]: I1202 21:35:44.987786 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:35:44 crc kubenswrapper[4807]: E1202 21:35:44.988794 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:35:58 crc kubenswrapper[4807]: I1202 21:35:58.972532 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:35:58 crc kubenswrapper[4807]: E1202 21:35:58.973336 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:36:03 crc kubenswrapper[4807]: I1202 21:36:03.869898 4807 generic.go:334] 
"Generic (PLEG): container finished" podID="05e87d0f-5df3-41f4-acfa-205f16a9e631" containerID="ae76063a4f826518852c367341478e5f52996a19842b1b905fec60f81ec2c73d" exitCode=0 Dec 02 21:36:03 crc kubenswrapper[4807]: I1202 21:36:03.870076 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r596x/crc-debug-mzkf2" event={"ID":"05e87d0f-5df3-41f4-acfa-205f16a9e631","Type":"ContainerDied","Data":"ae76063a4f826518852c367341478e5f52996a19842b1b905fec60f81ec2c73d"} Dec 02 21:36:05 crc kubenswrapper[4807]: I1202 21:36:05.005345 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-mzkf2" Dec 02 21:36:05 crc kubenswrapper[4807]: I1202 21:36:05.039598 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r596x/crc-debug-mzkf2"] Dec 02 21:36:05 crc kubenswrapper[4807]: I1202 21:36:05.049280 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r596x/crc-debug-mzkf2"] Dec 02 21:36:05 crc kubenswrapper[4807]: I1202 21:36:05.143678 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05e87d0f-5df3-41f4-acfa-205f16a9e631-host\") pod \"05e87d0f-5df3-41f4-acfa-205f16a9e631\" (UID: \"05e87d0f-5df3-41f4-acfa-205f16a9e631\") " Dec 02 21:36:05 crc kubenswrapper[4807]: I1202 21:36:05.143909 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05e87d0f-5df3-41f4-acfa-205f16a9e631-host" (OuterVolumeSpecName: "host") pod "05e87d0f-5df3-41f4-acfa-205f16a9e631" (UID: "05e87d0f-5df3-41f4-acfa-205f16a9e631"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 21:36:05 crc kubenswrapper[4807]: I1202 21:36:05.144272 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvh29\" (UniqueName: \"kubernetes.io/projected/05e87d0f-5df3-41f4-acfa-205f16a9e631-kube-api-access-dvh29\") pod \"05e87d0f-5df3-41f4-acfa-205f16a9e631\" (UID: \"05e87d0f-5df3-41f4-acfa-205f16a9e631\") " Dec 02 21:36:05 crc kubenswrapper[4807]: I1202 21:36:05.144897 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05e87d0f-5df3-41f4-acfa-205f16a9e631-host\") on node \"crc\" DevicePath \"\"" Dec 02 21:36:05 crc kubenswrapper[4807]: I1202 21:36:05.149587 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e87d0f-5df3-41f4-acfa-205f16a9e631-kube-api-access-dvh29" (OuterVolumeSpecName: "kube-api-access-dvh29") pod "05e87d0f-5df3-41f4-acfa-205f16a9e631" (UID: "05e87d0f-5df3-41f4-acfa-205f16a9e631"). InnerVolumeSpecName "kube-api-access-dvh29". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:36:05 crc kubenswrapper[4807]: I1202 21:36:05.247373 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvh29\" (UniqueName: \"kubernetes.io/projected/05e87d0f-5df3-41f4-acfa-205f16a9e631-kube-api-access-dvh29\") on node \"crc\" DevicePath \"\"" Dec 02 21:36:05 crc kubenswrapper[4807]: I1202 21:36:05.892626 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74da64f3ab3ce82c484f86ca30dc28951f58d6cc90082f610a7cba79cb0472f9" Dec 02 21:36:05 crc kubenswrapper[4807]: I1202 21:36:05.892752 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-mzkf2" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.352737 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r596x/crc-debug-nbzmw"] Dec 02 21:36:06 crc kubenswrapper[4807]: E1202 21:36:06.353157 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e87d0f-5df3-41f4-acfa-205f16a9e631" containerName="container-00" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.353171 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e87d0f-5df3-41f4-acfa-205f16a9e631" containerName="container-00" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.353392 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e87d0f-5df3-41f4-acfa-205f16a9e631" containerName="container-00" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.354198 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-nbzmw" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.471831 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9dsr\" (UniqueName: \"kubernetes.io/projected/19875e04-53c6-45dc-89d8-d419388c07bf-kube-api-access-q9dsr\") pod \"crc-debug-nbzmw\" (UID: \"19875e04-53c6-45dc-89d8-d419388c07bf\") " pod="openshift-must-gather-r596x/crc-debug-nbzmw" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.472107 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19875e04-53c6-45dc-89d8-d419388c07bf-host\") pod \"crc-debug-nbzmw\" (UID: \"19875e04-53c6-45dc-89d8-d419388c07bf\") " pod="openshift-must-gather-r596x/crc-debug-nbzmw" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.574156 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/19875e04-53c6-45dc-89d8-d419388c07bf-host\") pod \"crc-debug-nbzmw\" (UID: \"19875e04-53c6-45dc-89d8-d419388c07bf\") " pod="openshift-must-gather-r596x/crc-debug-nbzmw" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.574421 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19875e04-53c6-45dc-89d8-d419388c07bf-host\") pod \"crc-debug-nbzmw\" (UID: \"19875e04-53c6-45dc-89d8-d419388c07bf\") " pod="openshift-must-gather-r596x/crc-debug-nbzmw" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.574546 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9dsr\" (UniqueName: \"kubernetes.io/projected/19875e04-53c6-45dc-89d8-d419388c07bf-kube-api-access-q9dsr\") pod \"crc-debug-nbzmw\" (UID: \"19875e04-53c6-45dc-89d8-d419388c07bf\") " pod="openshift-must-gather-r596x/crc-debug-nbzmw" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.602570 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9dsr\" (UniqueName: \"kubernetes.io/projected/19875e04-53c6-45dc-89d8-d419388c07bf-kube-api-access-q9dsr\") pod \"crc-debug-nbzmw\" (UID: \"19875e04-53c6-45dc-89d8-d419388c07bf\") " pod="openshift-must-gather-r596x/crc-debug-nbzmw" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.672201 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-nbzmw" Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.903818 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r596x/crc-debug-nbzmw" event={"ID":"19875e04-53c6-45dc-89d8-d419388c07bf","Type":"ContainerStarted","Data":"547e3e552c4e841b7e2943fcfaa4cd7c941426ea44f111795363f39c338f621e"} Dec 02 21:36:06 crc kubenswrapper[4807]: I1202 21:36:06.984406 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05e87d0f-5df3-41f4-acfa-205f16a9e631" path="/var/lib/kubelet/pods/05e87d0f-5df3-41f4-acfa-205f16a9e631/volumes" Dec 02 21:36:07 crc kubenswrapper[4807]: I1202 21:36:07.929199 4807 generic.go:334] "Generic (PLEG): container finished" podID="19875e04-53c6-45dc-89d8-d419388c07bf" containerID="dcb70040b795ca1fd6511702dc1289bc462e39379c472d1c97d9f57daee1808b" exitCode=0 Dec 02 21:36:07 crc kubenswrapper[4807]: I1202 21:36:07.929255 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r596x/crc-debug-nbzmw" event={"ID":"19875e04-53c6-45dc-89d8-d419388c07bf","Type":"ContainerDied","Data":"dcb70040b795ca1fd6511702dc1289bc462e39379c472d1c97d9f57daee1808b"} Dec 02 21:36:09 crc kubenswrapper[4807]: I1202 21:36:09.031288 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-nbzmw" Dec 02 21:36:09 crc kubenswrapper[4807]: I1202 21:36:09.135615 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9dsr\" (UniqueName: \"kubernetes.io/projected/19875e04-53c6-45dc-89d8-d419388c07bf-kube-api-access-q9dsr\") pod \"19875e04-53c6-45dc-89d8-d419388c07bf\" (UID: \"19875e04-53c6-45dc-89d8-d419388c07bf\") " Dec 02 21:36:09 crc kubenswrapper[4807]: I1202 21:36:09.135915 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19875e04-53c6-45dc-89d8-d419388c07bf-host\") pod \"19875e04-53c6-45dc-89d8-d419388c07bf\" (UID: \"19875e04-53c6-45dc-89d8-d419388c07bf\") " Dec 02 21:36:09 crc kubenswrapper[4807]: I1202 21:36:09.139126 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19875e04-53c6-45dc-89d8-d419388c07bf-host" (OuterVolumeSpecName: "host") pod "19875e04-53c6-45dc-89d8-d419388c07bf" (UID: "19875e04-53c6-45dc-89d8-d419388c07bf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 21:36:09 crc kubenswrapper[4807]: I1202 21:36:09.175509 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19875e04-53c6-45dc-89d8-d419388c07bf-kube-api-access-q9dsr" (OuterVolumeSpecName: "kube-api-access-q9dsr") pod "19875e04-53c6-45dc-89d8-d419388c07bf" (UID: "19875e04-53c6-45dc-89d8-d419388c07bf"). InnerVolumeSpecName "kube-api-access-q9dsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:36:09 crc kubenswrapper[4807]: I1202 21:36:09.241081 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9dsr\" (UniqueName: \"kubernetes.io/projected/19875e04-53c6-45dc-89d8-d419388c07bf-kube-api-access-q9dsr\") on node \"crc\" DevicePath \"\"" Dec 02 21:36:09 crc kubenswrapper[4807]: I1202 21:36:09.241117 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/19875e04-53c6-45dc-89d8-d419388c07bf-host\") on node \"crc\" DevicePath \"\"" Dec 02 21:36:09 crc kubenswrapper[4807]: I1202 21:36:09.946396 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r596x/crc-debug-nbzmw" event={"ID":"19875e04-53c6-45dc-89d8-d419388c07bf","Type":"ContainerDied","Data":"547e3e552c4e841b7e2943fcfaa4cd7c941426ea44f111795363f39c338f621e"} Dec 02 21:36:09 crc kubenswrapper[4807]: I1202 21:36:09.946715 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="547e3e552c4e841b7e2943fcfaa4cd7c941426ea44f111795363f39c338f621e" Dec 02 21:36:09 crc kubenswrapper[4807]: I1202 21:36:09.946518 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-nbzmw" Dec 02 21:36:10 crc kubenswrapper[4807]: I1202 21:36:10.233294 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r596x/crc-debug-nbzmw"] Dec 02 21:36:10 crc kubenswrapper[4807]: I1202 21:36:10.243634 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r596x/crc-debug-nbzmw"] Dec 02 21:36:10 crc kubenswrapper[4807]: I1202 21:36:10.972478 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:36:10 crc kubenswrapper[4807]: E1202 21:36:10.972937 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:36:10 crc kubenswrapper[4807]: I1202 21:36:10.982662 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19875e04-53c6-45dc-89d8-d419388c07bf" path="/var/lib/kubelet/pods/19875e04-53c6-45dc-89d8-d419388c07bf/volumes" Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.415628 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r596x/crc-debug-bpww6"] Dec 02 21:36:11 crc kubenswrapper[4807]: E1202 21:36:11.416277 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19875e04-53c6-45dc-89d8-d419388c07bf" containerName="container-00" Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.416292 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="19875e04-53c6-45dc-89d8-d419388c07bf" containerName="container-00" Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.416500 4807 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="19875e04-53c6-45dc-89d8-d419388c07bf" containerName="container-00" Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.417131 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-bpww6" Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.483759 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-host\") pod \"crc-debug-bpww6\" (UID: \"2b1c61b4-7781-4105-a39b-72fe77bcd2fc\") " pod="openshift-must-gather-r596x/crc-debug-bpww6" Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.483908 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb5dv\" (UniqueName: \"kubernetes.io/projected/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-kube-api-access-zb5dv\") pod \"crc-debug-bpww6\" (UID: \"2b1c61b4-7781-4105-a39b-72fe77bcd2fc\") " pod="openshift-must-gather-r596x/crc-debug-bpww6" Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.586133 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-host\") pod \"crc-debug-bpww6\" (UID: \"2b1c61b4-7781-4105-a39b-72fe77bcd2fc\") " pod="openshift-must-gather-r596x/crc-debug-bpww6" Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.586263 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb5dv\" (UniqueName: \"kubernetes.io/projected/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-kube-api-access-zb5dv\") pod \"crc-debug-bpww6\" (UID: \"2b1c61b4-7781-4105-a39b-72fe77bcd2fc\") " pod="openshift-must-gather-r596x/crc-debug-bpww6" Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.586304 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-host\") pod \"crc-debug-bpww6\" (UID: \"2b1c61b4-7781-4105-a39b-72fe77bcd2fc\") " pod="openshift-must-gather-r596x/crc-debug-bpww6" Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.608830 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb5dv\" (UniqueName: \"kubernetes.io/projected/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-kube-api-access-zb5dv\") pod \"crc-debug-bpww6\" (UID: \"2b1c61b4-7781-4105-a39b-72fe77bcd2fc\") " pod="openshift-must-gather-r596x/crc-debug-bpww6" Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.739605 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-bpww6" Dec 02 21:36:11 crc kubenswrapper[4807]: W1202 21:36:11.774304 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b1c61b4_7781_4105_a39b_72fe77bcd2fc.slice/crio-f25abfeb17b804584afbdab9a5fd11fca74a105eef235d946c7418694d285b35 WatchSource:0}: Error finding container f25abfeb17b804584afbdab9a5fd11fca74a105eef235d946c7418694d285b35: Status 404 returned error can't find the container with id f25abfeb17b804584afbdab9a5fd11fca74a105eef235d946c7418694d285b35 Dec 02 21:36:11 crc kubenswrapper[4807]: I1202 21:36:11.962069 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r596x/crc-debug-bpww6" event={"ID":"2b1c61b4-7781-4105-a39b-72fe77bcd2fc","Type":"ContainerStarted","Data":"f25abfeb17b804584afbdab9a5fd11fca74a105eef235d946c7418694d285b35"} Dec 02 21:36:12 crc kubenswrapper[4807]: I1202 21:36:12.974831 4807 generic.go:334] "Generic (PLEG): container finished" podID="2b1c61b4-7781-4105-a39b-72fe77bcd2fc" containerID="149b401ddf60711945c07baaa46eaafd5e7faf487ecefe993d41d3c645146882" exitCode=0 Dec 02 21:36:12 crc kubenswrapper[4807]: I1202 21:36:12.985391 4807 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-r596x/crc-debug-bpww6" event={"ID":"2b1c61b4-7781-4105-a39b-72fe77bcd2fc","Type":"ContainerDied","Data":"149b401ddf60711945c07baaa46eaafd5e7faf487ecefe993d41d3c645146882"} Dec 02 21:36:13 crc kubenswrapper[4807]: I1202 21:36:13.020235 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r596x/crc-debug-bpww6"] Dec 02 21:36:13 crc kubenswrapper[4807]: I1202 21:36:13.036215 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r596x/crc-debug-bpww6"] Dec 02 21:36:14 crc kubenswrapper[4807]: I1202 21:36:14.119293 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-bpww6" Dec 02 21:36:14 crc kubenswrapper[4807]: I1202 21:36:14.238554 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-host\") pod \"2b1c61b4-7781-4105-a39b-72fe77bcd2fc\" (UID: \"2b1c61b4-7781-4105-a39b-72fe77bcd2fc\") " Dec 02 21:36:14 crc kubenswrapper[4807]: I1202 21:36:14.238800 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb5dv\" (UniqueName: \"kubernetes.io/projected/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-kube-api-access-zb5dv\") pod \"2b1c61b4-7781-4105-a39b-72fe77bcd2fc\" (UID: \"2b1c61b4-7781-4105-a39b-72fe77bcd2fc\") " Dec 02 21:36:14 crc kubenswrapper[4807]: I1202 21:36:14.239472 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-host" (OuterVolumeSpecName: "host") pod "2b1c61b4-7781-4105-a39b-72fe77bcd2fc" (UID: "2b1c61b4-7781-4105-a39b-72fe77bcd2fc"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 21:36:14 crc kubenswrapper[4807]: I1202 21:36:14.247657 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-kube-api-access-zb5dv" (OuterVolumeSpecName: "kube-api-access-zb5dv") pod "2b1c61b4-7781-4105-a39b-72fe77bcd2fc" (UID: "2b1c61b4-7781-4105-a39b-72fe77bcd2fc"). InnerVolumeSpecName "kube-api-access-zb5dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:36:14 crc kubenswrapper[4807]: I1202 21:36:14.341905 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-host\") on node \"crc\" DevicePath \"\"" Dec 02 21:36:14 crc kubenswrapper[4807]: I1202 21:36:14.341950 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb5dv\" (UniqueName: \"kubernetes.io/projected/2b1c61b4-7781-4105-a39b-72fe77bcd2fc-kube-api-access-zb5dv\") on node \"crc\" DevicePath \"\"" Dec 02 21:36:14 crc kubenswrapper[4807]: I1202 21:36:14.996706 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b1c61b4-7781-4105-a39b-72fe77bcd2fc" path="/var/lib/kubelet/pods/2b1c61b4-7781-4105-a39b-72fe77bcd2fc/volumes" Dec 02 21:36:15 crc kubenswrapper[4807]: I1202 21:36:15.024420 4807 scope.go:117] "RemoveContainer" containerID="149b401ddf60711945c07baaa46eaafd5e7faf487ecefe993d41d3c645146882" Dec 02 21:36:15 crc kubenswrapper[4807]: I1202 21:36:15.024700 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r596x/crc-debug-bpww6" Dec 02 21:36:22 crc kubenswrapper[4807]: I1202 21:36:22.972966 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:36:22 crc kubenswrapper[4807]: E1202 21:36:22.973923 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:36:33 crc kubenswrapper[4807]: I1202 21:36:33.972909 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:36:33 crc kubenswrapper[4807]: E1202 21:36:33.974359 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.120002 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xl859"] Dec 02 21:36:35 crc kubenswrapper[4807]: E1202 21:36:35.120835 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1c61b4-7781-4105-a39b-72fe77bcd2fc" containerName="container-00" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.120853 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1c61b4-7781-4105-a39b-72fe77bcd2fc" containerName="container-00" Dec 02 21:36:35 crc 
kubenswrapper[4807]: I1202 21:36:35.121083 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1c61b4-7781-4105-a39b-72fe77bcd2fc" containerName="container-00" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.122781 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.147762 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xl859"] Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.174236 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-catalog-content\") pod \"certified-operators-xl859\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.174321 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-utilities\") pod \"certified-operators-xl859\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.174340 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8khs\" (UniqueName: \"kubernetes.io/projected/19d25a24-60e7-40db-af9c-16b13eb5dfc3-kube-api-access-g8khs\") pod \"certified-operators-xl859\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.275825 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-catalog-content\") pod \"certified-operators-xl859\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.275891 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-utilities\") pod \"certified-operators-xl859\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.275913 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8khs\" (UniqueName: \"kubernetes.io/projected/19d25a24-60e7-40db-af9c-16b13eb5dfc3-kube-api-access-g8khs\") pod \"certified-operators-xl859\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.276614 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-catalog-content\") pod \"certified-operators-xl859\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.276866 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-utilities\") pod \"certified-operators-xl859\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.298860 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8khs\" (UniqueName: 
\"kubernetes.io/projected/19d25a24-60e7-40db-af9c-16b13eb5dfc3-kube-api-access-g8khs\") pod \"certified-operators-xl859\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:35 crc kubenswrapper[4807]: I1202 21:36:35.456163 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:36 crc kubenswrapper[4807]: W1202 21:36:36.043779 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d25a24_60e7_40db_af9c_16b13eb5dfc3.slice/crio-b9612b7ea18bb5c7b93baa8e6727df81c6ae62240125dbc46e592d952ad5d935 WatchSource:0}: Error finding container b9612b7ea18bb5c7b93baa8e6727df81c6ae62240125dbc46e592d952ad5d935: Status 404 returned error can't find the container with id b9612b7ea18bb5c7b93baa8e6727df81c6ae62240125dbc46e592d952ad5d935 Dec 02 21:36:36 crc kubenswrapper[4807]: I1202 21:36:36.050942 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xl859"] Dec 02 21:36:36 crc kubenswrapper[4807]: I1202 21:36:36.229396 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl859" event={"ID":"19d25a24-60e7-40db-af9c-16b13eb5dfc3","Type":"ContainerStarted","Data":"b9612b7ea18bb5c7b93baa8e6727df81c6ae62240125dbc46e592d952ad5d935"} Dec 02 21:36:37 crc kubenswrapper[4807]: I1202 21:36:37.246285 4807 generic.go:334] "Generic (PLEG): container finished" podID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" containerID="62104a3f7aa5c00face646437ee1a9dae15c80c8d4586ada2a13a5a4a08f7451" exitCode=0 Dec 02 21:36:37 crc kubenswrapper[4807]: I1202 21:36:37.246413 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl859" 
event={"ID":"19d25a24-60e7-40db-af9c-16b13eb5dfc3","Type":"ContainerDied","Data":"62104a3f7aa5c00face646437ee1a9dae15c80c8d4586ada2a13a5a4a08f7451"} Dec 02 21:36:37 crc kubenswrapper[4807]: I1202 21:36:37.249714 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 21:36:39 crc kubenswrapper[4807]: I1202 21:36:39.275948 4807 generic.go:334] "Generic (PLEG): container finished" podID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" containerID="162d9d4390f271a0d0b1338ac6f6a2cc253177b9e411f7d7ea00456031970229" exitCode=0 Dec 02 21:36:39 crc kubenswrapper[4807]: I1202 21:36:39.276003 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl859" event={"ID":"19d25a24-60e7-40db-af9c-16b13eb5dfc3","Type":"ContainerDied","Data":"162d9d4390f271a0d0b1338ac6f6a2cc253177b9e411f7d7ea00456031970229"} Dec 02 21:36:40 crc kubenswrapper[4807]: I1202 21:36:40.292145 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl859" event={"ID":"19d25a24-60e7-40db-af9c-16b13eb5dfc3","Type":"ContainerStarted","Data":"84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4"} Dec 02 21:36:40 crc kubenswrapper[4807]: I1202 21:36:40.331789 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xl859" podStartSLOduration=2.613910968 podStartE2EDuration="5.331761296s" podCreationTimestamp="2025-12-02 21:36:35 +0000 UTC" firstStartedPulling="2025-12-02 21:36:37.249278347 +0000 UTC m=+5932.550185882" lastFinishedPulling="2025-12-02 21:36:39.967128715 +0000 UTC m=+5935.268036210" observedRunningTime="2025-12-02 21:36:40.318916062 +0000 UTC m=+5935.619823587" watchObservedRunningTime="2025-12-02 21:36:40.331761296 +0000 UTC m=+5935.632668801" Dec 02 21:36:45 crc kubenswrapper[4807]: I1202 21:36:45.458173 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:45 crc kubenswrapper[4807]: I1202 21:36:45.458613 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:45 crc kubenswrapper[4807]: I1202 21:36:45.533294 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:46 crc kubenswrapper[4807]: I1202 21:36:46.458904 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:46 crc kubenswrapper[4807]: I1202 21:36:46.537958 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xl859"] Dec 02 21:36:47 crc kubenswrapper[4807]: I1202 21:36:47.944197 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dff7976bd-s4t8d_0d4efb1f-6d37-4673-94fc-33623db07604/barbican-api/0.log" Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.106076 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5dff7976bd-s4t8d_0d4efb1f-6d37-4673-94fc-33623db07604/barbican-api-log/0.log" Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.127991 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76444866d4-7vv98_845bcc3a-0d65-4ba8-bbb0-6f95d4778851/barbican-keystone-listener/0.log" Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.227286 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76444866d4-7vv98_845bcc3a-0d65-4ba8-bbb0-6f95d4778851/barbican-keystone-listener-log/0.log" Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.319787 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f5f4b885-lpj6r_926c4543-c7c5-41aa-a5ed-46035ee41498/barbican-worker/0.log" Dec 02 
21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.395507 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xl859" podUID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" containerName="registry-server" containerID="cri-o://84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4" gracePeriod=2 Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.413819 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f5f4b885-lpj6r_926c4543-c7c5-41aa-a5ed-46035ee41498/barbican-worker-log/0.log" Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.638659 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_10d90a40-6c28-4353-91b3-87e966ad1ac7/ceilometer-central-agent/0.log" Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.686660 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-74b28_59148eea-351d-4d4c-ba60-e39e47372466/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.775539 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_10d90a40-6c28-4353-91b3-87e966ad1ac7/ceilometer-notification-agent/0.log" Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.822702 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.856451 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_10d90a40-6c28-4353-91b3-87e966ad1ac7/proxy-httpd/0.log" Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.933176 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_10d90a40-6c28-4353-91b3-87e966ad1ac7/sg-core/0.log" Dec 02 21:36:48 crc kubenswrapper[4807]: I1202 21:36:48.972538 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:36:48 crc kubenswrapper[4807]: E1202 21:36:48.972932 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.003293 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-catalog-content\") pod \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.003364 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-utilities\") pod \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.003396 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-g8khs\" (UniqueName: \"kubernetes.io/projected/19d25a24-60e7-40db-af9c-16b13eb5dfc3-kube-api-access-g8khs\") pod \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\" (UID: \"19d25a24-60e7-40db-af9c-16b13eb5dfc3\") " Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.005522 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-utilities" (OuterVolumeSpecName: "utilities") pod "19d25a24-60e7-40db-af9c-16b13eb5dfc3" (UID: "19d25a24-60e7-40db-af9c-16b13eb5dfc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.009594 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d25a24-60e7-40db-af9c-16b13eb5dfc3-kube-api-access-g8khs" (OuterVolumeSpecName: "kube-api-access-g8khs") pod "19d25a24-60e7-40db-af9c-16b13eb5dfc3" (UID: "19d25a24-60e7-40db-af9c-16b13eb5dfc3"). InnerVolumeSpecName "kube-api-access-g8khs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.063186 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19d25a24-60e7-40db-af9c-16b13eb5dfc3" (UID: "19d25a24-60e7-40db-af9c-16b13eb5dfc3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.094798 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_06947941-0c96-4330-b2f7-bbc193dcdf61/cinder-api-log/0.log" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.105800 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.105824 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19d25a24-60e7-40db-af9c-16b13eb5dfc3-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.105833 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8khs\" (UniqueName: \"kubernetes.io/projected/19d25a24-60e7-40db-af9c-16b13eb5dfc3-kube-api-access-g8khs\") on node \"crc\" DevicePath \"\"" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.124231 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_06947941-0c96-4330-b2f7-bbc193dcdf61/cinder-api/0.log" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.291905 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_4a7b23d9-c399-44ec-995e-54726ae83774/cinder-scheduler/0.log" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.304495 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_4a7b23d9-c399-44ec-995e-54726ae83774/probe/0.log" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.371317 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-fvhdm_5f0ca7a8-13bb-43b1-8be3-cf1f9f8b0ecf/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" 
Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.406161 4807 generic.go:334] "Generic (PLEG): container finished" podID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" containerID="84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4" exitCode=0 Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.406217 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl859" event={"ID":"19d25a24-60e7-40db-af9c-16b13eb5dfc3","Type":"ContainerDied","Data":"84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4"} Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.406222 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xl859" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.406253 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xl859" event={"ID":"19d25a24-60e7-40db-af9c-16b13eb5dfc3","Type":"ContainerDied","Data":"b9612b7ea18bb5c7b93baa8e6727df81c6ae62240125dbc46e592d952ad5d935"} Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.406272 4807 scope.go:117] "RemoveContainer" containerID="84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.429731 4807 scope.go:117] "RemoveContainer" containerID="162d9d4390f271a0d0b1338ac6f6a2cc253177b9e411f7d7ea00456031970229" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.457909 4807 scope.go:117] "RemoveContainer" containerID="62104a3f7aa5c00face646437ee1a9dae15c80c8d4586ada2a13a5a4a08f7451" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.495245 4807 scope.go:117] "RemoveContainer" containerID="84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4" Dec 02 21:36:49 crc kubenswrapper[4807]: E1202 21:36:49.495779 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4\": container with ID starting with 84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4 not found: ID does not exist" containerID="84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.497211 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4"} err="failed to get container status \"84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4\": rpc error: code = NotFound desc = could not find container \"84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4\": container with ID starting with 84e6f09520aee6bc66807ffa7832e8bda2c854e9fd6e9770fbe6545578c0c2f4 not found: ID does not exist" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.497267 4807 scope.go:117] "RemoveContainer" containerID="162d9d4390f271a0d0b1338ac6f6a2cc253177b9e411f7d7ea00456031970229" Dec 02 21:36:49 crc kubenswrapper[4807]: E1202 21:36:49.497928 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"162d9d4390f271a0d0b1338ac6f6a2cc253177b9e411f7d7ea00456031970229\": container with ID starting with 162d9d4390f271a0d0b1338ac6f6a2cc253177b9e411f7d7ea00456031970229 not found: ID does not exist" containerID="162d9d4390f271a0d0b1338ac6f6a2cc253177b9e411f7d7ea00456031970229" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.497958 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"162d9d4390f271a0d0b1338ac6f6a2cc253177b9e411f7d7ea00456031970229"} err="failed to get container status \"162d9d4390f271a0d0b1338ac6f6a2cc253177b9e411f7d7ea00456031970229\": rpc error: code = NotFound desc = could not find container \"162d9d4390f271a0d0b1338ac6f6a2cc253177b9e411f7d7ea00456031970229\": container 
with ID starting with 162d9d4390f271a0d0b1338ac6f6a2cc253177b9e411f7d7ea00456031970229 not found: ID does not exist" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.497980 4807 scope.go:117] "RemoveContainer" containerID="62104a3f7aa5c00face646437ee1a9dae15c80c8d4586ada2a13a5a4a08f7451" Dec 02 21:36:49 crc kubenswrapper[4807]: E1202 21:36:49.499302 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62104a3f7aa5c00face646437ee1a9dae15c80c8d4586ada2a13a5a4a08f7451\": container with ID starting with 62104a3f7aa5c00face646437ee1a9dae15c80c8d4586ada2a13a5a4a08f7451 not found: ID does not exist" containerID="62104a3f7aa5c00face646437ee1a9dae15c80c8d4586ada2a13a5a4a08f7451" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.499338 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62104a3f7aa5c00face646437ee1a9dae15c80c8d4586ada2a13a5a4a08f7451"} err="failed to get container status \"62104a3f7aa5c00face646437ee1a9dae15c80c8d4586ada2a13a5a4a08f7451\": rpc error: code = NotFound desc = could not find container \"62104a3f7aa5c00face646437ee1a9dae15c80c8d4586ada2a13a5a4a08f7451\": container with ID starting with 62104a3f7aa5c00face646437ee1a9dae15c80c8d4586ada2a13a5a4a08f7451 not found: ID does not exist" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.506984 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xl859"] Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.516273 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xl859"] Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.559191 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dw249_d72aa681-f5b3-4192-aa10-a4b6fc8519b9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 
21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.630529 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b865b64bc-72s4n_7cb6399e-e732-4865-9d51-5f15eb42c502/init/0.log" Dec 02 21:36:49 crc kubenswrapper[4807]: E1202 21:36:49.651567 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d25a24_60e7_40db_af9c_16b13eb5dfc3.slice\": RecentStats: unable to find data in memory cache]" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.807001 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b865b64bc-72s4n_7cb6399e-e732-4865-9d51-5f15eb42c502/init/0.log" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.902594 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-nmtfp_3399e62b-d5c6-4469-9507-75e4e922201e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:49 crc kubenswrapper[4807]: I1202 21:36:49.946829 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b865b64bc-72s4n_7cb6399e-e732-4865-9d51-5f15eb42c502/dnsmasq-dns/0.log" Dec 02 21:36:50 crc kubenswrapper[4807]: I1202 21:36:50.084926 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_72b4a2ac-3f2c-4064-bcd9-b40585699ab9/glance-log/0.log" Dec 02 21:36:50 crc kubenswrapper[4807]: I1202 21:36:50.101020 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_72b4a2ac-3f2c-4064-bcd9-b40585699ab9/glance-httpd/0.log" Dec 02 21:36:50 crc kubenswrapper[4807]: I1202 21:36:50.312112 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7b4380db-c1e5-4f3b-81f6-ae5a6d71119a/glance-httpd/0.log" Dec 02 21:36:50 crc kubenswrapper[4807]: I1202 
21:36:50.327158 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7b4380db-c1e5-4f3b-81f6-ae5a6d71119a/glance-log/0.log" Dec 02 21:36:50 crc kubenswrapper[4807]: I1202 21:36:50.634158 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-c4h4x_64a6a7a0-63cc-48bb-a936-21fbab3123e9/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:50 crc kubenswrapper[4807]: I1202 21:36:50.708557 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5cbfd7dcb-hzflv_f5570109-9e91-473c-8a41-47081ace3591/horizon/0.log" Dec 02 21:36:50 crc kubenswrapper[4807]: I1202 21:36:50.990489 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" path="/var/lib/kubelet/pods/19d25a24-60e7-40db-af9c-16b13eb5dfc3/volumes" Dec 02 21:36:51 crc kubenswrapper[4807]: I1202 21:36:51.060966 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5cbfd7dcb-hzflv_f5570109-9e91-473c-8a41-47081ace3591/horizon-log/0.log" Dec 02 21:36:51 crc kubenswrapper[4807]: I1202 21:36:51.168990 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-5lbnt_0c2e1673-a8ae-401a-b874-d425c01fad63/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:51 crc kubenswrapper[4807]: I1202 21:36:51.370573 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29411821-qfgdj_1f152d73-b7a0-4142-8f65-2343fca9dc2e/keystone-cron/0.log" Dec 02 21:36:51 crc kubenswrapper[4807]: I1202 21:36:51.533705 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_a603da43-4e3b-4e75-8c4e-9e90908e2af4/kube-state-metrics/0.log" Dec 02 21:36:51 crc kubenswrapper[4807]: I1202 21:36:51.595324 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-6d4cd98589-bnbn5_bcde4ab3-e62a-40bc-86b7-6d1c5e1af116/keystone-api/0.log" Dec 02 21:36:51 crc kubenswrapper[4807]: I1202 21:36:51.704756 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-hk9gf_a22ccfdd-695f-49fe-9bd9-5f1109915c63/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:52 crc kubenswrapper[4807]: I1202 21:36:52.117756 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-xx75x_f2cd5e17-9097-447f-8fcf-7e95a2621845/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:52 crc kubenswrapper[4807]: I1202 21:36:52.176565 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5597979745-dn972_a98ea655-0ec2-4d0f-951a-57f5ee9f6df2/neutron-httpd/0.log" Dec 02 21:36:52 crc kubenswrapper[4807]: I1202 21:36:52.208988 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5597979745-dn972_a98ea655-0ec2-4d0f-951a-57f5ee9f6df2/neutron-api/0.log" Dec 02 21:36:53 crc kubenswrapper[4807]: I1202 21:36:53.014049 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a6c8daa7-6ecd-42ee-9e62-e53eb3ade43a/nova-cell0-conductor-conductor/0.log" Dec 02 21:36:53 crc kubenswrapper[4807]: I1202 21:36:53.298375 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c5860289-2a92-47f1-855c-399a8c590f7f/nova-cell1-conductor-conductor/0.log" Dec 02 21:36:53 crc kubenswrapper[4807]: I1202 21:36:53.627349 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9caa6001-4e75-4042-998f-9f00f49ef173/nova-api-log/0.log" Dec 02 21:36:53 crc kubenswrapper[4807]: I1202 21:36:53.821267 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_fc1416d8-8665-48f9-ad43-b2e16b6a5ecb/nova-cell1-novncproxy-novncproxy/0.log" Dec 02 21:36:53 crc kubenswrapper[4807]: I1202 21:36:53.908297 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-48b5r_c7cb6b66-35b2-477f-8d6a-3037a6931797/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:54 crc kubenswrapper[4807]: I1202 21:36:54.134884 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_96fcc760-d492-4e5b-8d31-de6c7f49b47f/nova-metadata-log/0.log" Dec 02 21:36:54 crc kubenswrapper[4807]: I1202 21:36:54.246351 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9caa6001-4e75-4042-998f-9f00f49ef173/nova-api-api/0.log" Dec 02 21:36:54 crc kubenswrapper[4807]: I1202 21:36:54.600127 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_96ba9206-497c-4cd1-a16d-436d2ba285a7/mysql-bootstrap/0.log" Dec 02 21:36:54 crc kubenswrapper[4807]: I1202 21:36:54.764306 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_96ba9206-497c-4cd1-a16d-436d2ba285a7/mysql-bootstrap/0.log" Dec 02 21:36:54 crc kubenswrapper[4807]: I1202 21:36:54.846164 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_96ba9206-497c-4cd1-a16d-436d2ba285a7/galera/0.log" Dec 02 21:36:54 crc kubenswrapper[4807]: I1202 21:36:54.869222 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_3db3f424-7a28-419f-b5e1-0dec9279d417/nova-scheduler-scheduler/0.log" Dec 02 21:36:55 crc kubenswrapper[4807]: I1202 21:36:55.074150 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_df8d83f3-6675-416b-a039-2aafac45fe18/mysql-bootstrap/0.log" Dec 02 21:36:55 crc kubenswrapper[4807]: I1202 21:36:55.301778 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_df8d83f3-6675-416b-a039-2aafac45fe18/mysql-bootstrap/0.log" Dec 02 21:36:55 crc kubenswrapper[4807]: I1202 21:36:55.365345 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_df8d83f3-6675-416b-a039-2aafac45fe18/galera/0.log" Dec 02 21:36:55 crc kubenswrapper[4807]: I1202 21:36:55.524850 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0f6c2f22-8527-4428-a503-7aedd5635e6b/openstackclient/0.log" Dec 02 21:36:55 crc kubenswrapper[4807]: I1202 21:36:55.662155 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-kstxp_80844aa2-667c-4b2d-a55a-e5fa2cd3dd85/ovn-controller/0.log" Dec 02 21:36:55 crc kubenswrapper[4807]: I1202 21:36:55.877986 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-c2pcc_0ee2869c-161d-442b-81a3-b3790ab8cdfe/openstack-network-exporter/0.log" Dec 02 21:36:56 crc kubenswrapper[4807]: I1202 21:36:56.050208 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pxxrz_f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd/ovsdb-server-init/0.log" Dec 02 21:36:56 crc kubenswrapper[4807]: I1202 21:36:56.274364 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pxxrz_f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd/ovsdb-server-init/0.log" Dec 02 21:36:56 crc kubenswrapper[4807]: I1202 21:36:56.275411 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pxxrz_f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd/ovs-vswitchd/0.log" Dec 02 21:36:56 crc kubenswrapper[4807]: I1202 21:36:56.303342 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pxxrz_f3e74bd8-f6ab-439d-b7b1-afa538fbbbcd/ovsdb-server/0.log" Dec 02 21:36:56 crc kubenswrapper[4807]: I1202 21:36:56.359697 4807 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-metadata-0_96fcc760-d492-4e5b-8d31-de6c7f49b47f/nova-metadata-metadata/0.log" Dec 02 21:36:56 crc kubenswrapper[4807]: I1202 21:36:56.493113 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9q5fk_8b4b56e1-9070-4a42-beff-c3d9324e820c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:56 crc kubenswrapper[4807]: I1202 21:36:56.624359 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_06539675-3505-4f57-bdfd-54ccdb96d90a/openstack-network-exporter/0.log" Dec 02 21:36:56 crc kubenswrapper[4807]: I1202 21:36:56.908657 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_06539675-3505-4f57-bdfd-54ccdb96d90a/ovn-northd/0.log" Dec 02 21:36:57 crc kubenswrapper[4807]: I1202 21:36:57.043937 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a63890f-30c8-4538-903a-121488dba6bb/openstack-network-exporter/0.log" Dec 02 21:36:57 crc kubenswrapper[4807]: I1202 21:36:57.045181 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a63890f-30c8-4538-903a-121488dba6bb/ovsdbserver-nb/0.log" Dec 02 21:36:57 crc kubenswrapper[4807]: I1202 21:36:57.276240 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_47753529-368a-4c5c-a3f8-27ffd55e41d1/ovsdbserver-sb/0.log" Dec 02 21:36:57 crc kubenswrapper[4807]: I1202 21:36:57.327734 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_47753529-368a-4c5c-a3f8-27ffd55e41d1/openstack-network-exporter/0.log" Dec 02 21:36:57 crc kubenswrapper[4807]: I1202 21:36:57.587637 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_35ee0b99-6360-4b6d-bc80-8b420b1054c0/init-config-reloader/0.log" Dec 02 21:36:57 crc kubenswrapper[4807]: I1202 21:36:57.656000 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_placement-844d66f984-gvswh_cd577cf0-d4de-4a57-9254-8a7bf61aa686/placement-api/0.log" Dec 02 21:36:57 crc kubenswrapper[4807]: I1202 21:36:57.747011 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-844d66f984-gvswh_cd577cf0-d4de-4a57-9254-8a7bf61aa686/placement-log/0.log" Dec 02 21:36:57 crc kubenswrapper[4807]: I1202 21:36:57.879536 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_35ee0b99-6360-4b6d-bc80-8b420b1054c0/config-reloader/0.log" Dec 02 21:36:57 crc kubenswrapper[4807]: I1202 21:36:57.918198 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_35ee0b99-6360-4b6d-bc80-8b420b1054c0/init-config-reloader/0.log" Dec 02 21:36:57 crc kubenswrapper[4807]: I1202 21:36:57.966507 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_35ee0b99-6360-4b6d-bc80-8b420b1054c0/thanos-sidecar/0.log" Dec 02 21:36:57 crc kubenswrapper[4807]: I1202 21:36:57.994391 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_35ee0b99-6360-4b6d-bc80-8b420b1054c0/prometheus/0.log" Dec 02 21:36:58 crc kubenswrapper[4807]: I1202 21:36:58.139559 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_748ead81-bff5-4a69-9398-4e3c91be5979/setup-container/0.log" Dec 02 21:36:58 crc kubenswrapper[4807]: I1202 21:36:58.470736 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8593e062-85d8-4f22-88b4-eb7cf5654859/setup-container/0.log" Dec 02 21:36:58 crc kubenswrapper[4807]: I1202 21:36:58.471603 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_748ead81-bff5-4a69-9398-4e3c91be5979/setup-container/0.log" Dec 02 21:36:58 crc kubenswrapper[4807]: I1202 21:36:58.498442 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_748ead81-bff5-4a69-9398-4e3c91be5979/rabbitmq/0.log" Dec 02 21:36:58 crc kubenswrapper[4807]: I1202 21:36:58.768953 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8593e062-85d8-4f22-88b4-eb7cf5654859/setup-container/0.log" Dec 02 21:36:58 crc kubenswrapper[4807]: I1202 21:36:58.779362 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-l8dv6_89a829f1-32e4-4b5b-ba48-196916b1da6f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:58 crc kubenswrapper[4807]: I1202 21:36:58.794187 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_8593e062-85d8-4f22-88b4-eb7cf5654859/rabbitmq/0.log" Dec 02 21:36:59 crc kubenswrapper[4807]: I1202 21:36:59.025331 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-vxgdk_8483568b-e11d-4aea-8fcb-1925d2e64fa2/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:59 crc kubenswrapper[4807]: I1202 21:36:59.059991 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-89mf9_2548dec0-ad57-411d-891a-0b847b25a4bb/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:59 crc kubenswrapper[4807]: I1202 21:36:59.313188 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-pzgnd_94998bd3-5f5b-47cd-b14c-39e55cb78eaa/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:36:59 crc kubenswrapper[4807]: I1202 21:36:59.373487 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-8d2sc_d0bb0a20-106f-412e-8ba3-b218bacdadf5/ssh-known-hosts-edpm-deployment/0.log" Dec 02 21:36:59 crc kubenswrapper[4807]: I1202 21:36:59.673373 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9c96f4455-bvlsr_60cf7565-bc2c-469d-a0ad-400e95d69528/proxy-server/0.log" Dec 02 21:36:59 crc kubenswrapper[4807]: I1202 21:36:59.740644 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-msbf5_a3dd1529-ab40-486b-8458-e3d1afc9a0e2/swift-ring-rebalance/0.log" Dec 02 21:36:59 crc kubenswrapper[4807]: I1202 21:36:59.751117 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9c96f4455-bvlsr_60cf7565-bc2c-469d-a0ad-400e95d69528/proxy-httpd/0.log" Dec 02 21:36:59 crc kubenswrapper[4807]: I1202 21:36:59.857497 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/account-auditor/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.147593 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/account-reaper/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.241681 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/account-replicator/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.353067 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/account-server/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.366091 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/container-auditor/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.437310 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/container-replicator/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.545198 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/container-server/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.628975 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/container-updater/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.632361 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/object-auditor/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.692318 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/object-expirer/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.764495 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/object-replicator/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.865894 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/object-server/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.907431 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/object-updater/0.log" Dec 02 21:37:00 crc kubenswrapper[4807]: I1202 21:37:00.948316 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/rsync/0.log" Dec 02 21:37:01 crc kubenswrapper[4807]: I1202 21:37:01.042824 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_caaccc0b-6743-4907-9d87-f4ab26c931e2/swift-recon-cron/0.log" Dec 02 21:37:01 crc kubenswrapper[4807]: I1202 21:37:01.179009 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-s86vg_c5be8c89-b466-4c89-aecd-548b6d5d19ae/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:37:01 crc kubenswrapper[4807]: I1202 21:37:01.272708 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e6826607-5100-439e-b82d-224b312a6faa/tempest-tests-tempest-tests-runner/0.log" Dec 02 21:37:01 crc kubenswrapper[4807]: I1202 21:37:01.452665 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_f0d7af3c-3d32-47d8-ab5b-3293bd5e26eb/test-operator-logs-container/0.log" Dec 02 21:37:01 crc kubenswrapper[4807]: I1202 21:37:01.601373 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-tb6hj_f741a266-b127-46cc-8304-9aedd57f07b5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 21:37:02 crc kubenswrapper[4807]: I1202 21:37:02.241880 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_1d68c545-0435-4f66-a351-3ccba6fa68a3/watcher-applier/0.log" Dec 02 21:37:02 crc kubenswrapper[4807]: I1202 21:37:02.703213 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_bc49afa3-486b-481f-bb06-5b9bb2701021/watcher-api-log/0.log" Dec 02 21:37:02 crc kubenswrapper[4807]: I1202 21:37:02.972816 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:37:02 crc kubenswrapper[4807]: E1202 21:37:02.973091 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:37:03 crc kubenswrapper[4807]: I1202 21:37:03.414736 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_05674c10-e8c2-4ab8-9d80-185c9b814c9c/watcher-decision-engine/0.log" Dec 02 21:37:05 crc kubenswrapper[4807]: I1202 21:37:05.850341 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_bc49afa3-486b-481f-bb06-5b9bb2701021/watcher-api/0.log" Dec 02 21:37:06 crc kubenswrapper[4807]: I1202 21:37:06.255754 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4225655b-c174-479e-a740-b768c9801287/memcached/0.log" Dec 02 21:37:17 crc kubenswrapper[4807]: I1202 21:37:17.972597 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:37:17 crc kubenswrapper[4807]: E1202 21:37:17.974426 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:37:28 crc kubenswrapper[4807]: I1202 21:37:28.435064 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/util/0.log" Dec 02 21:37:28 crc kubenswrapper[4807]: I1202 21:37:28.583393 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/util/0.log" Dec 02 21:37:28 crc kubenswrapper[4807]: I1202 
21:37:28.591260 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/pull/0.log" Dec 02 21:37:28 crc kubenswrapper[4807]: I1202 21:37:28.621324 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/pull/0.log" Dec 02 21:37:28 crc kubenswrapper[4807]: I1202 21:37:28.771760 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/pull/0.log" Dec 02 21:37:28 crc kubenswrapper[4807]: I1202 21:37:28.787043 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/util/0.log" Dec 02 21:37:28 crc kubenswrapper[4807]: I1202 21:37:28.840249 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9b5ee995779f0294dadcb6c28f1abd010256202e791862117d54e55394bjbzd_94bd7c70-1bb7-4b0e-816e-5be2df3641da/extract/0.log" Dec 02 21:37:29 crc kubenswrapper[4807]: I1202 21:37:29.033296 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-r9jtf_7c007dd6-7efa-47c1-af56-ce0bf8fd6f37/kube-rbac-proxy/0.log" Dec 02 21:37:29 crc kubenswrapper[4807]: I1202 21:37:29.037447 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-g2lf7_3982be8e-b5d2-4795-9312-f3ba8466209c/kube-rbac-proxy/0.log" Dec 02 21:37:29 crc kubenswrapper[4807]: I1202 21:37:29.083293 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-r9jtf_7c007dd6-7efa-47c1-af56-ce0bf8fd6f37/manager/0.log" Dec 02 21:37:29 crc kubenswrapper[4807]: I1202 21:37:29.239785 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-g2lf7_3982be8e-b5d2-4795-9312-f3ba8466209c/manager/0.log" Dec 02 21:37:29 crc kubenswrapper[4807]: I1202 21:37:29.301378 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-p2b2h_357ab26f-5ac4-46a1-b8f3-89db969b4082/kube-rbac-proxy/0.log" Dec 02 21:37:29 crc kubenswrapper[4807]: I1202 21:37:29.373212 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-p2b2h_357ab26f-5ac4-46a1-b8f3-89db969b4082/manager/0.log" Dec 02 21:37:29 crc kubenswrapper[4807]: I1202 21:37:29.473390 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-mss4c_cfb9049e-2275-4d96-9131-29bb4def714b/kube-rbac-proxy/0.log" Dec 02 21:37:29 crc kubenswrapper[4807]: I1202 21:37:29.640225 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-mss4c_cfb9049e-2275-4d96-9131-29bb4def714b/manager/0.log" Dec 02 21:37:29 crc kubenswrapper[4807]: I1202 21:37:29.690707 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kq246_3c3d38aa-3600-41f6-97b3-e3699796526e/manager/0.log" Dec 02 21:37:29 crc kubenswrapper[4807]: I1202 21:37:29.729598 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-kq246_3c3d38aa-3600-41f6-97b3-e3699796526e/kube-rbac-proxy/0.log" Dec 02 21:37:30 crc kubenswrapper[4807]: I1202 21:37:30.012222 4807 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-6mvz2_f376fe60-0cdf-4b30-ab61-80178d738ea4/kube-rbac-proxy/0.log" Dec 02 21:37:30 crc kubenswrapper[4807]: I1202 21:37:30.035367 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-6mvz2_f376fe60-0cdf-4b30-ab61-80178d738ea4/manager/0.log" Dec 02 21:37:30 crc kubenswrapper[4807]: I1202 21:37:30.386977 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dn7v2_d7712aec-0995-489a-8cee-7e68fbf130df/kube-rbac-proxy/0.log" Dec 02 21:37:30 crc kubenswrapper[4807]: I1202 21:37:30.414434 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-dn7v2_d7712aec-0995-489a-8cee-7e68fbf130df/manager/0.log" Dec 02 21:37:30 crc kubenswrapper[4807]: I1202 21:37:30.489680 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-tz65v_5ceaf50b-92b7-4069-b9b8-660e90c55d97/kube-rbac-proxy/0.log" Dec 02 21:37:30 crc kubenswrapper[4807]: I1202 21:37:30.518881 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-tz65v_5ceaf50b-92b7-4069-b9b8-660e90c55d97/manager/0.log" Dec 02 21:37:30 crc kubenswrapper[4807]: I1202 21:37:30.678674 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gmwq8_9198e60c-4301-40b6-9d1b-3e91a2f10fa5/kube-rbac-proxy/0.log" Dec 02 21:37:30 crc kubenswrapper[4807]: I1202 21:37:30.738523 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gmwq8_9198e60c-4301-40b6-9d1b-3e91a2f10fa5/manager/0.log" Dec 02 21:37:30 crc 
kubenswrapper[4807]: I1202 21:37:30.920271 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-wzdmc_fa1a5516-5c0d-4d4e-b052-d9301371a2d3/kube-rbac-proxy/0.log" Dec 02 21:37:30 crc kubenswrapper[4807]: I1202 21:37:30.924646 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-wzdmc_fa1a5516-5c0d-4d4e-b052-d9301371a2d3/manager/0.log" Dec 02 21:37:30 crc kubenswrapper[4807]: I1202 21:37:30.999698 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-52v8m_490b3442-f4b4-493d-824a-67e370ac26f9/kube-rbac-proxy/0.log" Dec 02 21:37:31 crc kubenswrapper[4807]: I1202 21:37:31.131108 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-52v8m_490b3442-f4b4-493d-824a-67e370ac26f9/manager/0.log" Dec 02 21:37:31 crc kubenswrapper[4807]: I1202 21:37:31.158404 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-gd4bj_7cccb577-4849-4e1c-b38e-669f7658eb2e/kube-rbac-proxy/0.log" Dec 02 21:37:31 crc kubenswrapper[4807]: I1202 21:37:31.252897 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-gd4bj_7cccb577-4849-4e1c-b38e-669f7658eb2e/manager/0.log" Dec 02 21:37:31 crc kubenswrapper[4807]: I1202 21:37:31.354981 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-g8559_6511dc8d-00b4-4937-a330-0f5cf9c06fdd/kube-rbac-proxy/0.log" Dec 02 21:37:31 crc kubenswrapper[4807]: I1202 21:37:31.420998 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-g8559_6511dc8d-00b4-4937-a330-0f5cf9c06fdd/manager/0.log" Dec 02 21:37:31 crc kubenswrapper[4807]: I1202 21:37:31.508626 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9g547_1f4141f5-ba14-4c49-b114-07e5d506b255/kube-rbac-proxy/0.log" Dec 02 21:37:31 crc kubenswrapper[4807]: I1202 21:37:31.552601 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9g547_1f4141f5-ba14-4c49-b114-07e5d506b255/manager/0.log" Dec 02 21:37:31 crc kubenswrapper[4807]: I1202 21:37:31.687370 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk_e0e35837-6389-4e86-b8c5-46105f1332cb/kube-rbac-proxy/0.log" Dec 02 21:37:31 crc kubenswrapper[4807]: I1202 21:37:31.744834 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4m6fvk_e0e35837-6389-4e86-b8c5-46105f1332cb/manager/0.log" Dec 02 21:37:31 crc kubenswrapper[4807]: I1202 21:37:31.972327 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:37:31 crc kubenswrapper[4807]: E1202 21:37:31.972597 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:37:31 crc kubenswrapper[4807]: I1202 21:37:31.991707 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-h8bl8_1f433a73-95ba-41cc-9f6e-3c6b26dd5e50/registry-server/0.log" Dec 02 21:37:32 crc kubenswrapper[4807]: I1202 21:37:32.046949 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-84f48485bd-tsr5l_9f824d2a-934d-4e25-95dd-6323a038f878/operator/0.log" Dec 02 21:37:32 crc kubenswrapper[4807]: I1202 21:37:32.226768 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-4qmqw_959355af-f6bd-492c-af58-9a7378224225/manager/0.log" Dec 02 21:37:32 crc kubenswrapper[4807]: I1202 21:37:32.361059 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-4qmqw_959355af-f6bd-492c-af58-9a7378224225/kube-rbac-proxy/0.log" Dec 02 21:37:32 crc kubenswrapper[4807]: I1202 21:37:32.399922 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-nh4hg_b2ef8498-337b-40c6-b122-19863c876321/kube-rbac-proxy/0.log" Dec 02 21:37:32 crc kubenswrapper[4807]: I1202 21:37:32.559429 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-nh4hg_b2ef8498-337b-40c6-b122-19863c876321/manager/0.log" Dec 02 21:37:32 crc kubenswrapper[4807]: I1202 21:37:32.630644 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-t8rj2_3035086c-2661-4720-97b1-df4d0cd891a6/operator/0.log" Dec 02 21:37:32 crc kubenswrapper[4807]: I1202 21:37:32.772298 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-7pqn2_b53c3dbf-2380-4dff-9b18-a2207efcce60/kube-rbac-proxy/0.log" Dec 02 21:37:32 crc kubenswrapper[4807]: I1202 21:37:32.829219 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-7pqn2_b53c3dbf-2380-4dff-9b18-a2207efcce60/manager/0.log" Dec 02 21:37:32 crc kubenswrapper[4807]: I1202 21:37:32.831747 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-57fb6dd487-lffvh_4039c119-ee84-4043-8892-733499aabdc5/manager/0.log" Dec 02 21:37:32 crc kubenswrapper[4807]: I1202 21:37:32.900313 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-tzhtf_8dd00692-0728-47cf-b8fa-ab812b11ec8f/kube-rbac-proxy/0.log" Dec 02 21:37:33 crc kubenswrapper[4807]: I1202 21:37:33.131214 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-tzhtf_8dd00692-0728-47cf-b8fa-ab812b11ec8f/manager/0.log" Dec 02 21:37:33 crc kubenswrapper[4807]: I1202 21:37:33.244417 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-qrthg_733f7038-d2b9-4047-8ee9-3ad49a55729d/kube-rbac-proxy/0.log" Dec 02 21:37:33 crc kubenswrapper[4807]: I1202 21:37:33.258216 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-qrthg_733f7038-d2b9-4047-8ee9-3ad49a55729d/manager/0.log" Dec 02 21:37:33 crc kubenswrapper[4807]: I1202 21:37:33.359443 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-58888ff59d-cwws5_2a376983-2f33-465c-9781-391b67941e21/kube-rbac-proxy/0.log" Dec 02 21:37:33 crc kubenswrapper[4807]: I1202 21:37:33.474628 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-58888ff59d-cwws5_2a376983-2f33-465c-9781-391b67941e21/manager/0.log" Dec 02 21:37:43 crc kubenswrapper[4807]: 
I1202 21:37:43.972869 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:37:43 crc kubenswrapper[4807]: E1202 21:37:43.975380 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:37:54 crc kubenswrapper[4807]: I1202 21:37:54.982829 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:37:54 crc kubenswrapper[4807]: E1202 21:37:54.983471 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:37:55 crc kubenswrapper[4807]: I1202 21:37:55.293246 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-n7sh5_617adf8b-8ca9-4578-8c24-8f6b22713567/control-plane-machine-set-operator/0.log" Dec 02 21:37:55 crc kubenswrapper[4807]: I1202 21:37:55.469169 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7c8mj_3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8/kube-rbac-proxy/0.log" Dec 02 21:37:55 crc kubenswrapper[4807]: I1202 21:37:55.480047 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7c8mj_3ad92eb0-17bb-4a51-8c3b-6c9fb121d1f8/machine-api-operator/0.log" Dec 02 21:38:08 crc kubenswrapper[4807]: I1202 21:38:08.973854 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:38:08 crc kubenswrapper[4807]: E1202 21:38:08.975123 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:38:09 crc kubenswrapper[4807]: I1202 21:38:09.082651 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-mx7pp_d47db1e8-80ed-44a6-9273-9d9fd2b05e33/cert-manager-controller/0.log" Dec 02 21:38:09 crc kubenswrapper[4807]: I1202 21:38:09.283166 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wkdtr_2d1fe4d2-cc39-4c0c-a5e4-60366c119f94/cert-manager-cainjector/0.log" Dec 02 21:38:09 crc kubenswrapper[4807]: I1202 21:38:09.353141 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-fcc6m_1e90575d-2771-427e-a759-824575491965/cert-manager-webhook/0.log" Dec 02 21:38:22 crc kubenswrapper[4807]: I1202 21:38:22.976425 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:38:22 crc kubenswrapper[4807]: E1202 21:38:22.977183 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:38:23 crc kubenswrapper[4807]: I1202 21:38:23.908543 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-skgcp_37bb14e1-531b-43cf-b232-c11257dcf690/nmstate-console-plugin/0.log" Dec 02 21:38:24 crc kubenswrapper[4807]: I1202 21:38:24.117896 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wd9cv_8b23693d-f1f9-4ae2-9558-44a4a25745bd/nmstate-handler/0.log" Dec 02 21:38:24 crc kubenswrapper[4807]: I1202 21:38:24.171742 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6vnbp_01df7c9d-768d-417f-a7ed-7865655d889d/kube-rbac-proxy/0.log" Dec 02 21:38:24 crc kubenswrapper[4807]: I1202 21:38:24.178907 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6vnbp_01df7c9d-768d-417f-a7ed-7865655d889d/nmstate-metrics/0.log" Dec 02 21:38:24 crc kubenswrapper[4807]: I1202 21:38:24.359246 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-tqh42_6f69b73b-75f4-4c02-a252-efd2ea50b022/nmstate-operator/0.log" Dec 02 21:38:24 crc kubenswrapper[4807]: I1202 21:38:24.387236 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-k8qmj_4a32ad98-5354-49e6-957e-ad0828445a24/nmstate-webhook/0.log" Dec 02 21:38:37 crc kubenswrapper[4807]: I1202 21:38:37.972997 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:38:37 crc kubenswrapper[4807]: E1202 21:38:37.973834 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:38:41 crc kubenswrapper[4807]: I1202 21:38:41.215594 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-qwrnk_af18f5b9-8057-49c0-b0b0-d64a7fff5357/kube-rbac-proxy/0.log" Dec 02 21:38:41 crc kubenswrapper[4807]: I1202 21:38:41.238778 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-qwrnk_af18f5b9-8057-49c0-b0b0-d64a7fff5357/controller/0.log" Dec 02 21:38:41 crc kubenswrapper[4807]: I1202 21:38:41.448264 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-frr-files/0.log" Dec 02 21:38:41 crc kubenswrapper[4807]: I1202 21:38:41.615456 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-reloader/0.log" Dec 02 21:38:41 crc kubenswrapper[4807]: I1202 21:38:41.626440 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-metrics/0.log" Dec 02 21:38:41 crc kubenswrapper[4807]: I1202 21:38:41.640075 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-frr-files/0.log" Dec 02 21:38:41 crc kubenswrapper[4807]: I1202 21:38:41.658773 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-reloader/0.log" Dec 02 21:38:41 crc kubenswrapper[4807]: I1202 21:38:41.856790 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-metrics/0.log" Dec 02 21:38:41 crc kubenswrapper[4807]: I1202 21:38:41.868750 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-frr-files/0.log" Dec 02 21:38:41 crc kubenswrapper[4807]: I1202 21:38:41.885066 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-reloader/0.log" Dec 02 21:38:41 crc kubenswrapper[4807]: I1202 21:38:41.910922 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-metrics/0.log" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.069666 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-reloader/0.log" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.088759 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-frr-files/0.log" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.094592 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/cp-metrics/0.log" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.124115 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/controller/0.log" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.152430 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lgnx2"] Dec 02 21:38:42 crc kubenswrapper[4807]: E1202 21:38:42.152859 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" containerName="extract-utilities" Dec 02 21:38:42 crc 
kubenswrapper[4807]: I1202 21:38:42.152876 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" containerName="extract-utilities" Dec 02 21:38:42 crc kubenswrapper[4807]: E1202 21:38:42.152905 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" containerName="registry-server" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.152912 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" containerName="registry-server" Dec 02 21:38:42 crc kubenswrapper[4807]: E1202 21:38:42.152933 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" containerName="extract-content" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.152938 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" containerName="extract-content" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.153282 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d25a24-60e7-40db-af9c-16b13eb5dfc3" containerName="registry-server" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.157558 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.166067 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lgnx2"] Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.348897 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-catalog-content\") pod \"redhat-operators-lgnx2\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.348964 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv8mv\" (UniqueName: \"kubernetes.io/projected/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-kube-api-access-fv8mv\") pod \"redhat-operators-lgnx2\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.349154 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-utilities\") pod \"redhat-operators-lgnx2\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.350950 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/kube-rbac-proxy-frr/0.log" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.414842 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/frr-metrics/0.log" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.452696 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-utilities\") pod \"redhat-operators-lgnx2\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.452902 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-catalog-content\") pod \"redhat-operators-lgnx2\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.452945 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv8mv\" (UniqueName: \"kubernetes.io/projected/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-kube-api-access-fv8mv\") pod \"redhat-operators-lgnx2\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.453256 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-utilities\") pod \"redhat-operators-lgnx2\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.453864 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-catalog-content\") pod \"redhat-operators-lgnx2\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.486901 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fv8mv\" (UniqueName: \"kubernetes.io/projected/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-kube-api-access-fv8mv\") pod \"redhat-operators-lgnx2\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.506345 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/kube-rbac-proxy/0.log" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.667428 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/reloader/0.log" Dec 02 21:38:42 crc kubenswrapper[4807]: I1202 21:38:42.778319 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:43 crc kubenswrapper[4807]: I1202 21:38:43.141882 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-4hldc_13a694d0-f9dd-46f3-84de-5b6c7b6a9e4f/frr-k8s-webhook-server/0.log" Dec 02 21:38:43 crc kubenswrapper[4807]: I1202 21:38:43.306964 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7684466ddc-qvkp7_a7267c32-ac50-4ee6-8766-f9e586c3bf39/manager/0.log" Dec 02 21:38:43 crc kubenswrapper[4807]: I1202 21:38:43.500398 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lgnx2"] Dec 02 21:38:43 crc kubenswrapper[4807]: I1202 21:38:43.639357 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgnx2" event={"ID":"0446dbf7-dbf6-4074-825d-47abbbbcb7ce","Type":"ContainerStarted","Data":"f61aaa55370e15c11f8db5a6c32995217c01470c9ad851bce52a159aa23c8b21"} Dec 02 21:38:43 crc kubenswrapper[4807]: I1202 21:38:43.641738 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-9c8db4b6b-74mvv_40eff5fb-df4c-47e1-bddf-ec09d648f511/webhook-server/0.log" Dec 02 21:38:43 crc kubenswrapper[4807]: I1202 21:38:43.838100 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vgclf_10f57a6c-ce50-4026-a330-b0a195528a92/kube-rbac-proxy/0.log" Dec 02 21:38:44 crc kubenswrapper[4807]: I1202 21:38:44.012696 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-d5xdx_ed377bf7-d0c1-45d0-bad2-948f4bde39aa/frr/0.log" Dec 02 21:38:44 crc kubenswrapper[4807]: I1202 21:38:44.351900 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vgclf_10f57a6c-ce50-4026-a330-b0a195528a92/speaker/0.log" Dec 02 21:38:44 crc kubenswrapper[4807]: I1202 21:38:44.649355 4807 generic.go:334] "Generic (PLEG): container finished" podID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerID="a9cc8e050ab13cc58b78bba0d641d427489401b569de779264b8b3bb0e76f37a" exitCode=0 Dec 02 21:38:44 crc kubenswrapper[4807]: I1202 21:38:44.649395 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgnx2" event={"ID":"0446dbf7-dbf6-4074-825d-47abbbbcb7ce","Type":"ContainerDied","Data":"a9cc8e050ab13cc58b78bba0d641d427489401b569de779264b8b3bb0e76f37a"} Dec 02 21:38:45 crc kubenswrapper[4807]: I1202 21:38:45.659435 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgnx2" event={"ID":"0446dbf7-dbf6-4074-825d-47abbbbcb7ce","Type":"ContainerStarted","Data":"045ef251a43d4058dfb494113960cc3de5b56ebefebeb0deaab0f5674305652f"} Dec 02 21:38:46 crc kubenswrapper[4807]: I1202 21:38:46.671869 4807 generic.go:334] "Generic (PLEG): container finished" podID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerID="045ef251a43d4058dfb494113960cc3de5b56ebefebeb0deaab0f5674305652f" exitCode=0 Dec 02 21:38:46 crc kubenswrapper[4807]: I1202 21:38:46.671997 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgnx2" event={"ID":"0446dbf7-dbf6-4074-825d-47abbbbcb7ce","Type":"ContainerDied","Data":"045ef251a43d4058dfb494113960cc3de5b56ebefebeb0deaab0f5674305652f"} Dec 02 21:38:47 crc kubenswrapper[4807]: I1202 21:38:47.684466 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgnx2" event={"ID":"0446dbf7-dbf6-4074-825d-47abbbbcb7ce","Type":"ContainerStarted","Data":"08b0608ecda55dc96ff4a300ea792f24825d946730ee9eddd950f584478578c8"} Dec 02 21:38:47 crc kubenswrapper[4807]: I1202 21:38:47.712512 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lgnx2" podStartSLOduration=3.194209769 podStartE2EDuration="5.712493477s" podCreationTimestamp="2025-12-02 21:38:42 +0000 UTC" firstStartedPulling="2025-12-02 21:38:44.65124823 +0000 UTC m=+6059.952155725" lastFinishedPulling="2025-12-02 21:38:47.169531938 +0000 UTC m=+6062.470439433" observedRunningTime="2025-12-02 21:38:47.702214845 +0000 UTC m=+6063.003122360" watchObservedRunningTime="2025-12-02 21:38:47.712493477 +0000 UTC m=+6063.013400972" Dec 02 21:38:50 crc kubenswrapper[4807]: I1202 21:38:50.973768 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:38:50 crc kubenswrapper[4807]: E1202 21:38:50.974410 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" Dec 02 21:38:52 crc kubenswrapper[4807]: I1202 21:38:52.779029 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:52 crc kubenswrapper[4807]: I1202 21:38:52.779130 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:38:53 crc kubenswrapper[4807]: I1202 21:38:53.866300 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lgnx2" podUID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerName="registry-server" probeResult="failure" output=< Dec 02 21:38:53 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Dec 02 21:38:53 crc kubenswrapper[4807]: > Dec 02 21:38:59 crc kubenswrapper[4807]: I1202 21:38:59.906567 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/util/0.log" Dec 02 21:39:00 crc kubenswrapper[4807]: I1202 21:39:00.089996 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/util/0.log" Dec 02 21:39:00 crc kubenswrapper[4807]: I1202 21:39:00.090968 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/pull/0.log" Dec 02 21:39:00 crc kubenswrapper[4807]: I1202 21:39:00.127593 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/pull/0.log" Dec 02 21:39:00 crc kubenswrapper[4807]: I1202 21:39:00.296305 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/extract/0.log" Dec 02 21:39:00 
crc kubenswrapper[4807]: I1202 21:39:00.312178 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/pull/0.log" Dec 02 21:39:00 crc kubenswrapper[4807]: I1202 21:39:00.338377 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fd4xwx_718520aa-df66-40e9-a10a-ac83475f1997/util/0.log" Dec 02 21:39:00 crc kubenswrapper[4807]: I1202 21:39:00.500471 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/util/0.log" Dec 02 21:39:00 crc kubenswrapper[4807]: I1202 21:39:00.700497 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/pull/0.log" Dec 02 21:39:00 crc kubenswrapper[4807]: I1202 21:39:00.709106 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/util/0.log" Dec 02 21:39:00 crc kubenswrapper[4807]: I1202 21:39:00.768399 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/pull/0.log" Dec 02 21:39:00 crc kubenswrapper[4807]: I1202 21:39:00.946633 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/util/0.log" Dec 02 21:39:01 crc kubenswrapper[4807]: I1202 21:39:01.007171 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/extract/0.log" Dec 02 21:39:01 crc kubenswrapper[4807]: I1202 21:39:01.012032 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c921066lnc_25348ba1-760f-46a3-9f25-4054fb9ebed4/pull/0.log" Dec 02 21:39:01 crc kubenswrapper[4807]: I1202 21:39:01.142593 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/util/0.log" Dec 02 21:39:01 crc kubenswrapper[4807]: I1202 21:39:01.320118 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/pull/0.log" Dec 02 21:39:01 crc kubenswrapper[4807]: I1202 21:39:01.323993 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/util/0.log" Dec 02 21:39:01 crc kubenswrapper[4807]: I1202 21:39:01.370094 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/pull/0.log" Dec 02 21:39:01 crc kubenswrapper[4807]: I1202 21:39:01.600088 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/extract/0.log" Dec 02 21:39:01 crc kubenswrapper[4807]: I1202 21:39:01.625419 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/pull/0.log" Dec 
02 21:39:01 crc kubenswrapper[4807]: I1202 21:39:01.646399 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f836w66g_9a9aee04-5b5a-4c8b-a0e8-b16ed522d829/util/0.log" Dec 02 21:39:01 crc kubenswrapper[4807]: I1202 21:39:01.783440 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-utilities/0.log" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.039639 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-utilities/0.log" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.061603 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-content/0.log" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.064552 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-content/0.log" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.219332 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-content/0.log" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.241909 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/extract-utilities/0.log" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.424313 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-utilities/0.log" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.888454 4807 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.906954 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-utilities/0.log" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.918819 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-content/0.log" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.926358 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-content/0.log" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.953450 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:39:02 crc kubenswrapper[4807]: I1202 21:39:02.985813 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96" Dec 02 21:39:03 crc kubenswrapper[4807]: I1202 21:39:03.033963 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qfvxm_c593c7c1-9bcc-4c52-92de-818b1cae7d51/registry-server/0.log" Dec 02 21:39:03 crc kubenswrapper[4807]: I1202 21:39:03.143902 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lgnx2"] Dec 02 21:39:03 crc kubenswrapper[4807]: I1202 21:39:03.487142 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-content/0.log" Dec 02 21:39:03 crc kubenswrapper[4807]: I1202 21:39:03.489190 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/extract-utilities/0.log" Dec 02 21:39:03 crc kubenswrapper[4807]: I1202 21:39:03.743612 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xd6tg_ffb3e245-8658-45b6-b784-250cd6d34a93/marketplace-operator/0.log" Dec 02 21:39:03 crc kubenswrapper[4807]: I1202 21:39:03.757208 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-utilities/0.log" Dec 02 21:39:03 crc kubenswrapper[4807]: I1202 21:39:03.875168 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"7f87a6149111260dfeafef3b8b9a52a111cb2cb7329874a8edbc1ac25329e586"} Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.099896 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-content/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.103021 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-utilities/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.131038 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-content/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.283097 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-utilities/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.306355 4807 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_community-operators-khjdb_b1e5d8f8-0730-44b0-beb7-652e4b9461bd/registry-server/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.361580 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/extract-content/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.521463 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jl5sd_658f9914-a1d3-4c38-a8fd-ef69123b8f0a/registry-server/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.523903 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgnx2_0446dbf7-dbf6-4074-825d-47abbbbcb7ce/extract-utilities/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.686844 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgnx2_0446dbf7-dbf6-4074-825d-47abbbbcb7ce/extract-utilities/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.698150 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgnx2_0446dbf7-dbf6-4074-825d-47abbbbcb7ce/extract-content/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.698213 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgnx2_0446dbf7-dbf6-4074-825d-47abbbbcb7ce/extract-content/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.885705 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lgnx2" podUID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerName="registry-server" containerID="cri-o://08b0608ecda55dc96ff4a300ea792f24825d946730ee9eddd950f584478578c8" gracePeriod=2 Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.900261 4807 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgnx2_0446dbf7-dbf6-4074-825d-47abbbbcb7ce/extract-content/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.929008 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgnx2_0446dbf7-dbf6-4074-825d-47abbbbcb7ce/extract-utilities/0.log" Dec 02 21:39:04 crc kubenswrapper[4807]: I1202 21:39:04.958650 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-utilities/0.log" Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.005137 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgnx2_0446dbf7-dbf6-4074-825d-47abbbbcb7ce/registry-server/0.log" Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.187306 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-content/0.log" Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.193987 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-utilities/0.log" Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.249974 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-content/0.log" Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.396092 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-utilities/0.log" Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.424844 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/extract-content/0.log" 
Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.737541 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vd6mt_13abe6c3-0a67-491f-abce-fc06c82b4707/registry-server/0.log" Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.896669 4807 generic.go:334] "Generic (PLEG): container finished" podID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerID="08b0608ecda55dc96ff4a300ea792f24825d946730ee9eddd950f584478578c8" exitCode=0 Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.896760 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgnx2" event={"ID":"0446dbf7-dbf6-4074-825d-47abbbbcb7ce","Type":"ContainerDied","Data":"08b0608ecda55dc96ff4a300ea792f24825d946730ee9eddd950f584478578c8"} Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.897060 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgnx2" event={"ID":"0446dbf7-dbf6-4074-825d-47abbbbcb7ce","Type":"ContainerDied","Data":"f61aaa55370e15c11f8db5a6c32995217c01470c9ad851bce52a159aa23c8b21"} Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.897080 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f61aaa55370e15c11f8db5a6c32995217c01470c9ad851bce52a159aa23c8b21" Dec 02 21:39:05 crc kubenswrapper[4807]: I1202 21:39:05.921593 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.108850 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv8mv\" (UniqueName: \"kubernetes.io/projected/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-kube-api-access-fv8mv\") pod \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.108954 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-utilities\") pod \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.109021 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-catalog-content\") pod \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\" (UID: \"0446dbf7-dbf6-4074-825d-47abbbbcb7ce\") " Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.109750 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-utilities" (OuterVolumeSpecName: "utilities") pod "0446dbf7-dbf6-4074-825d-47abbbbcb7ce" (UID: "0446dbf7-dbf6-4074-825d-47abbbbcb7ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.117640 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-kube-api-access-fv8mv" (OuterVolumeSpecName: "kube-api-access-fv8mv") pod "0446dbf7-dbf6-4074-825d-47abbbbcb7ce" (UID: "0446dbf7-dbf6-4074-825d-47abbbbcb7ce"). InnerVolumeSpecName "kube-api-access-fv8mv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.212268 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv8mv\" (UniqueName: \"kubernetes.io/projected/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-kube-api-access-fv8mv\") on node \"crc\" DevicePath \"\"" Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.212322 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.230697 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0446dbf7-dbf6-4074-825d-47abbbbcb7ce" (UID: "0446dbf7-dbf6-4074-825d-47abbbbcb7ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.315439 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0446dbf7-dbf6-4074-825d-47abbbbcb7ce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.905153 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lgnx2" Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.954183 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lgnx2"] Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.965508 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lgnx2"] Dec 02 21:39:06 crc kubenswrapper[4807]: I1202 21:39:06.988931 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" path="/var/lib/kubelet/pods/0446dbf7-dbf6-4074-825d-47abbbbcb7ce/volumes" Dec 02 21:39:20 crc kubenswrapper[4807]: I1202 21:39:20.648887 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-wv8s9_11697d44-d0d8-49be-ada1-7de7ab69950b/prometheus-operator/0.log" Dec 02 21:39:20 crc kubenswrapper[4807]: I1202 21:39:20.871261 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-fcbb9bc9c-94xzw_d45946cf-5cdf-4461-a0c0-c90b1367e919/prometheus-operator-admission-webhook/0.log" Dec 02 21:39:20 crc kubenswrapper[4807]: I1202 21:39:20.872195 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-fcbb9bc9c-9mxkr_a50d34d7-348c-4533-8b5b-8c5f3ee88af3/prometheus-operator-admission-webhook/0.log" Dec 02 21:39:21 crc kubenswrapper[4807]: I1202 21:39:21.228011 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-m69t7_818c8714-3224-4307-92c3-efc98ece9f1d/operator/0.log" Dec 02 21:39:21 crc kubenswrapper[4807]: I1202 21:39:21.243840 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-glnxl_6d0398c9-073c-437e-a5cf-e8abec984ebe/perses-operator/0.log" Dec 02 21:39:28 crc 
kubenswrapper[4807]: I1202 21:39:28.250138 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n7h7h"] Dec 02 21:39:28 crc kubenswrapper[4807]: E1202 21:39:28.251304 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerName="extract-utilities" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.251321 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerName="extract-utilities" Dec 02 21:39:28 crc kubenswrapper[4807]: E1202 21:39:28.251387 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerName="extract-content" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.251397 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerName="extract-content" Dec 02 21:39:28 crc kubenswrapper[4807]: E1202 21:39:28.251418 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerName="registry-server" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.251427 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerName="registry-server" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.251686 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0446dbf7-dbf6-4074-825d-47abbbbcb7ce" containerName="registry-server" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.253595 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.284611 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7h7h"] Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.330869 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-catalog-content\") pod \"redhat-marketplace-n7h7h\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.331312 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-utilities\") pod \"redhat-marketplace-n7h7h\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.331366 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gfsf\" (UniqueName: \"kubernetes.io/projected/4bd06548-f28f-4590-9e72-84e83dd0a4e8-kube-api-access-2gfsf\") pod \"redhat-marketplace-n7h7h\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.433344 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-utilities\") pod \"redhat-marketplace-n7h7h\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.433386 4807 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2gfsf\" (UniqueName: \"kubernetes.io/projected/4bd06548-f28f-4590-9e72-84e83dd0a4e8-kube-api-access-2gfsf\") pod \"redhat-marketplace-n7h7h\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.433440 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-catalog-content\") pod \"redhat-marketplace-n7h7h\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.433911 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-utilities\") pod \"redhat-marketplace-n7h7h\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.434352 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-catalog-content\") pod \"redhat-marketplace-n7h7h\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.470579 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gfsf\" (UniqueName: \"kubernetes.io/projected/4bd06548-f28f-4590-9e72-84e83dd0a4e8-kube-api-access-2gfsf\") pod \"redhat-marketplace-n7h7h\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.571309 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:28 crc kubenswrapper[4807]: I1202 21:39:28.865995 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7h7h"] Dec 02 21:39:29 crc kubenswrapper[4807]: I1202 21:39:29.167877 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7h7h" event={"ID":"4bd06548-f28f-4590-9e72-84e83dd0a4e8","Type":"ContainerStarted","Data":"0f5f64d477e18dcf5783ee8ab67062d91fb537be50414d6e810a13dbff375236"} Dec 02 21:39:30 crc kubenswrapper[4807]: I1202 21:39:30.176927 4807 generic.go:334] "Generic (PLEG): container finished" podID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" containerID="80099c119ebc35c0c0d85b5123b8feb890f7d8dc0874cdfc15dbe5a7b050eb16" exitCode=0 Dec 02 21:39:30 crc kubenswrapper[4807]: I1202 21:39:30.177295 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7h7h" event={"ID":"4bd06548-f28f-4590-9e72-84e83dd0a4e8","Type":"ContainerDied","Data":"80099c119ebc35c0c0d85b5123b8feb890f7d8dc0874cdfc15dbe5a7b050eb16"} Dec 02 21:39:32 crc kubenswrapper[4807]: I1202 21:39:32.198191 4807 generic.go:334] "Generic (PLEG): container finished" podID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" containerID="de549904b416b2ebe1fac83cc6a6a5fc1fb58903a05fb58cd3417050ed95fcfb" exitCode=0 Dec 02 21:39:32 crc kubenswrapper[4807]: I1202 21:39:32.198911 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7h7h" event={"ID":"4bd06548-f28f-4590-9e72-84e83dd0a4e8","Type":"ContainerDied","Data":"de549904b416b2ebe1fac83cc6a6a5fc1fb58903a05fb58cd3417050ed95fcfb"} Dec 02 21:39:33 crc kubenswrapper[4807]: I1202 21:39:33.215685 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7h7h" 
event={"ID":"4bd06548-f28f-4590-9e72-84e83dd0a4e8","Type":"ContainerStarted","Data":"c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6"} Dec 02 21:39:38 crc kubenswrapper[4807]: I1202 21:39:38.571990 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:38 crc kubenswrapper[4807]: I1202 21:39:38.572481 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:38 crc kubenswrapper[4807]: I1202 21:39:38.624317 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:38 crc kubenswrapper[4807]: I1202 21:39:38.655950 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n7h7h" podStartSLOduration=8.037248197 podStartE2EDuration="10.655922673s" podCreationTimestamp="2025-12-02 21:39:28 +0000 UTC" firstStartedPulling="2025-12-02 21:39:30.178302918 +0000 UTC m=+6105.479210413" lastFinishedPulling="2025-12-02 21:39:32.796977394 +0000 UTC m=+6108.097884889" observedRunningTime="2025-12-02 21:39:33.237872868 +0000 UTC m=+6108.538780363" watchObservedRunningTime="2025-12-02 21:39:38.655922673 +0000 UTC m=+6113.956830168" Dec 02 21:39:39 crc kubenswrapper[4807]: I1202 21:39:39.333246 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:39 crc kubenswrapper[4807]: I1202 21:39:39.382550 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7h7h"] Dec 02 21:39:41 crc kubenswrapper[4807]: I1202 21:39:41.304955 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n7h7h" podUID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" containerName="registry-server" 
containerID="cri-o://c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6" gracePeriod=2 Dec 02 21:39:41 crc kubenswrapper[4807]: I1202 21:39:41.797969 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:41 crc kubenswrapper[4807]: I1202 21:39:41.876636 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-catalog-content\") pod \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " Dec 02 21:39:41 crc kubenswrapper[4807]: I1202 21:39:41.876736 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-utilities\") pod \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " Dec 02 21:39:41 crc kubenswrapper[4807]: I1202 21:39:41.876811 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gfsf\" (UniqueName: \"kubernetes.io/projected/4bd06548-f28f-4590-9e72-84e83dd0a4e8-kube-api-access-2gfsf\") pod \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\" (UID: \"4bd06548-f28f-4590-9e72-84e83dd0a4e8\") " Dec 02 21:39:41 crc kubenswrapper[4807]: I1202 21:39:41.877533 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-utilities" (OuterVolumeSpecName: "utilities") pod "4bd06548-f28f-4590-9e72-84e83dd0a4e8" (UID: "4bd06548-f28f-4590-9e72-84e83dd0a4e8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:39:41 crc kubenswrapper[4807]: I1202 21:39:41.893008 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bd06548-f28f-4590-9e72-84e83dd0a4e8-kube-api-access-2gfsf" (OuterVolumeSpecName: "kube-api-access-2gfsf") pod "4bd06548-f28f-4590-9e72-84e83dd0a4e8" (UID: "4bd06548-f28f-4590-9e72-84e83dd0a4e8"). InnerVolumeSpecName "kube-api-access-2gfsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:39:41 crc kubenswrapper[4807]: I1202 21:39:41.893572 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bd06548-f28f-4590-9e72-84e83dd0a4e8" (UID: "4bd06548-f28f-4590-9e72-84e83dd0a4e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:39:41 crc kubenswrapper[4807]: I1202 21:39:41.979125 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 21:39:41 crc kubenswrapper[4807]: I1202 21:39:41.979163 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bd06548-f28f-4590-9e72-84e83dd0a4e8-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 21:39:41 crc kubenswrapper[4807]: I1202 21:39:41.979176 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gfsf\" (UniqueName: \"kubernetes.io/projected/4bd06548-f28f-4590-9e72-84e83dd0a4e8-kube-api-access-2gfsf\") on node \"crc\" DevicePath \"\"" Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.317934 4807 generic.go:334] "Generic (PLEG): container finished" podID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" 
containerID="c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6" exitCode=0 Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.318074 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7h7h" event={"ID":"4bd06548-f28f-4590-9e72-84e83dd0a4e8","Type":"ContainerDied","Data":"c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6"} Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.318163 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7h7h" event={"ID":"4bd06548-f28f-4590-9e72-84e83dd0a4e8","Type":"ContainerDied","Data":"0f5f64d477e18dcf5783ee8ab67062d91fb537be50414d6e810a13dbff375236"} Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.318193 4807 scope.go:117] "RemoveContainer" containerID="c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6" Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.318103 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7h7h" Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.356235 4807 scope.go:117] "RemoveContainer" containerID="de549904b416b2ebe1fac83cc6a6a5fc1fb58903a05fb58cd3417050ed95fcfb" Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.356459 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7h7h"] Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.367779 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7h7h"] Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.380094 4807 scope.go:117] "RemoveContainer" containerID="80099c119ebc35c0c0d85b5123b8feb890f7d8dc0874cdfc15dbe5a7b050eb16" Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.438416 4807 scope.go:117] "RemoveContainer" containerID="c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6" Dec 02 21:39:42 crc kubenswrapper[4807]: E1202 21:39:42.439020 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6\": container with ID starting with c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6 not found: ID does not exist" containerID="c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6" Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.439147 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6"} err="failed to get container status \"c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6\": rpc error: code = NotFound desc = could not find container \"c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6\": container with ID starting with c26d1a4f4d846e374e7a77dba2b8f8337ee3b794db7e74dedaa71635a31f7bf6 not found: 
ID does not exist" Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.439233 4807 scope.go:117] "RemoveContainer" containerID="de549904b416b2ebe1fac83cc6a6a5fc1fb58903a05fb58cd3417050ed95fcfb" Dec 02 21:39:42 crc kubenswrapper[4807]: E1202 21:39:42.439841 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de549904b416b2ebe1fac83cc6a6a5fc1fb58903a05fb58cd3417050ed95fcfb\": container with ID starting with de549904b416b2ebe1fac83cc6a6a5fc1fb58903a05fb58cd3417050ed95fcfb not found: ID does not exist" containerID="de549904b416b2ebe1fac83cc6a6a5fc1fb58903a05fb58cd3417050ed95fcfb" Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.439913 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de549904b416b2ebe1fac83cc6a6a5fc1fb58903a05fb58cd3417050ed95fcfb"} err="failed to get container status \"de549904b416b2ebe1fac83cc6a6a5fc1fb58903a05fb58cd3417050ed95fcfb\": rpc error: code = NotFound desc = could not find container \"de549904b416b2ebe1fac83cc6a6a5fc1fb58903a05fb58cd3417050ed95fcfb\": container with ID starting with de549904b416b2ebe1fac83cc6a6a5fc1fb58903a05fb58cd3417050ed95fcfb not found: ID does not exist" Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.439947 4807 scope.go:117] "RemoveContainer" containerID="80099c119ebc35c0c0d85b5123b8feb890f7d8dc0874cdfc15dbe5a7b050eb16" Dec 02 21:39:42 crc kubenswrapper[4807]: E1202 21:39:42.440915 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80099c119ebc35c0c0d85b5123b8feb890f7d8dc0874cdfc15dbe5a7b050eb16\": container with ID starting with 80099c119ebc35c0c0d85b5123b8feb890f7d8dc0874cdfc15dbe5a7b050eb16 not found: ID does not exist" containerID="80099c119ebc35c0c0d85b5123b8feb890f7d8dc0874cdfc15dbe5a7b050eb16" Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.440955 4807 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80099c119ebc35c0c0d85b5123b8feb890f7d8dc0874cdfc15dbe5a7b050eb16"} err="failed to get container status \"80099c119ebc35c0c0d85b5123b8feb890f7d8dc0874cdfc15dbe5a7b050eb16\": rpc error: code = NotFound desc = could not find container \"80099c119ebc35c0c0d85b5123b8feb890f7d8dc0874cdfc15dbe5a7b050eb16\": container with ID starting with 80099c119ebc35c0c0d85b5123b8feb890f7d8dc0874cdfc15dbe5a7b050eb16 not found: ID does not exist" Dec 02 21:39:42 crc kubenswrapper[4807]: I1202 21:39:42.982981 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" path="/var/lib/kubelet/pods/4bd06548-f28f-4590-9e72-84e83dd0a4e8/volumes" Dec 02 21:40:53 crc kubenswrapper[4807]: I1202 21:40:53.702297 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-9c96f4455-bvlsr" podUID="60cf7565-bc2c-469d-a0ad-400e95d69528" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 02 21:41:10 crc kubenswrapper[4807]: I1202 21:41:10.417519 4807 generic.go:334] "Generic (PLEG): container finished" podID="2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" containerID="e1ddf88e5a8a1cb1223500bd22fdd9599f635bd675451afebb109a9067548a25" exitCode=0 Dec 02 21:41:10 crc kubenswrapper[4807]: I1202 21:41:10.417639 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r596x/must-gather-9cmtg" event={"ID":"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165","Type":"ContainerDied","Data":"e1ddf88e5a8a1cb1223500bd22fdd9599f635bd675451afebb109a9067548a25"} Dec 02 21:41:10 crc kubenswrapper[4807]: I1202 21:41:10.418850 4807 scope.go:117] "RemoveContainer" containerID="e1ddf88e5a8a1cb1223500bd22fdd9599f635bd675451afebb109a9067548a25" Dec 02 21:41:10 crc kubenswrapper[4807]: I1202 21:41:10.996538 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-r596x_must-gather-9cmtg_2c806d6b-600e-4ee3-8e8b-cbabb5ddb165/gather/0.log" Dec 02 21:41:24 crc kubenswrapper[4807]: I1202 21:41:24.299615 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r596x/must-gather-9cmtg"] Dec 02 21:41:24 crc kubenswrapper[4807]: I1202 21:41:24.300533 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-r596x/must-gather-9cmtg" podUID="2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" containerName="copy" containerID="cri-o://15127fbaf633f9398b6486d4a1d3d48d56b1cccc96e38e02c882c2189c529ada" gracePeriod=2 Dec 02 21:41:24 crc kubenswrapper[4807]: I1202 21:41:24.312955 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r596x/must-gather-9cmtg"] Dec 02 21:41:24 crc kubenswrapper[4807]: I1202 21:41:24.582598 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r596x_must-gather-9cmtg_2c806d6b-600e-4ee3-8e8b-cbabb5ddb165/copy/0.log" Dec 02 21:41:24 crc kubenswrapper[4807]: I1202 21:41:24.583146 4807 generic.go:334] "Generic (PLEG): container finished" podID="2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" containerID="15127fbaf633f9398b6486d4a1d3d48d56b1cccc96e38e02c882c2189c529ada" exitCode=143 Dec 02 21:41:24 crc kubenswrapper[4807]: I1202 21:41:24.827856 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r596x_must-gather-9cmtg_2c806d6b-600e-4ee3-8e8b-cbabb5ddb165/copy/0.log" Dec 02 21:41:24 crc kubenswrapper[4807]: I1202 21:41:24.828650 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r596x/must-gather-9cmtg" Dec 02 21:41:24 crc kubenswrapper[4807]: I1202 21:41:24.964219 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-must-gather-output\") pod \"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165\" (UID: \"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165\") " Dec 02 21:41:24 crc kubenswrapper[4807]: I1202 21:41:24.964365 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7chbm\" (UniqueName: \"kubernetes.io/projected/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-kube-api-access-7chbm\") pod \"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165\" (UID: \"2c806d6b-600e-4ee3-8e8b-cbabb5ddb165\") " Dec 02 21:41:24 crc kubenswrapper[4807]: I1202 21:41:24.972975 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-kube-api-access-7chbm" (OuterVolumeSpecName: "kube-api-access-7chbm") pod "2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" (UID: "2c806d6b-600e-4ee3-8e8b-cbabb5ddb165"). InnerVolumeSpecName "kube-api-access-7chbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 21:41:25 crc kubenswrapper[4807]: I1202 21:41:25.070627 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7chbm\" (UniqueName: \"kubernetes.io/projected/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-kube-api-access-7chbm\") on node \"crc\" DevicePath \"\"" Dec 02 21:41:25 crc kubenswrapper[4807]: I1202 21:41:25.144679 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" (UID: "2c806d6b-600e-4ee3-8e8b-cbabb5ddb165"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 21:41:25 crc kubenswrapper[4807]: I1202 21:41:25.172766 4807 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 02 21:41:25 crc kubenswrapper[4807]: I1202 21:41:25.594176 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r596x_must-gather-9cmtg_2c806d6b-600e-4ee3-8e8b-cbabb5ddb165/copy/0.log" Dec 02 21:41:25 crc kubenswrapper[4807]: I1202 21:41:25.594462 4807 scope.go:117] "RemoveContainer" containerID="15127fbaf633f9398b6486d4a1d3d48d56b1cccc96e38e02c882c2189c529ada" Dec 02 21:41:25 crc kubenswrapper[4807]: I1202 21:41:25.594629 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r596x/must-gather-9cmtg" Dec 02 21:41:25 crc kubenswrapper[4807]: I1202 21:41:25.643898 4807 scope.go:117] "RemoveContainer" containerID="e1ddf88e5a8a1cb1223500bd22fdd9599f635bd675451afebb109a9067548a25" Dec 02 21:41:26 crc kubenswrapper[4807]: I1202 21:41:26.984227 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" path="/var/lib/kubelet/pods/2c806d6b-600e-4ee3-8e8b-cbabb5ddb165/volumes" Dec 02 21:41:28 crc kubenswrapper[4807]: I1202 21:41:28.292426 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:41:28 crc kubenswrapper[4807]: I1202 21:41:28.292741 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 21:41:58 crc kubenswrapper[4807]: I1202 21:41:58.292418 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:41:58 crc kubenswrapper[4807]: I1202 21:41:58.292825 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 21:42:14 crc kubenswrapper[4807]: I1202 21:42:14.369185 4807 scope.go:117] "RemoveContainer" containerID="dcb70040b795ca1fd6511702dc1289bc462e39379c472d1c97d9f57daee1808b" Dec 02 21:42:14 crc kubenswrapper[4807]: I1202 21:42:14.408128 4807 scope.go:117] "RemoveContainer" containerID="ae76063a4f826518852c367341478e5f52996a19842b1b905fec60f81ec2c73d" Dec 02 21:42:28 crc kubenswrapper[4807]: I1202 21:42:28.292695 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 21:42:28 crc kubenswrapper[4807]: I1202 21:42:28.293340 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 21:42:28 crc 
kubenswrapper[4807]: I1202 21:42:28.293412 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" Dec 02 21:42:28 crc kubenswrapper[4807]: I1202 21:42:28.294609 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f87a6149111260dfeafef3b8b9a52a111cb2cb7329874a8edbc1ac25329e586"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 21:42:28 crc kubenswrapper[4807]: I1202 21:42:28.294741 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://7f87a6149111260dfeafef3b8b9a52a111cb2cb7329874a8edbc1ac25329e586" gracePeriod=600 Dec 02 21:42:28 crc kubenswrapper[4807]: I1202 21:42:28.730699 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="7f87a6149111260dfeafef3b8b9a52a111cb2cb7329874a8edbc1ac25329e586" exitCode=0 Dec 02 21:42:28 crc kubenswrapper[4807]: I1202 21:42:28.730833 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"7f87a6149111260dfeafef3b8b9a52a111cb2cb7329874a8edbc1ac25329e586"} Dec 02 21:42:28 crc kubenswrapper[4807]: I1202 21:42:28.731184 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerStarted","Data":"85d5e6ef4dc22acb7e295b39bad89145375495d24356a92707bcc342b9f67488"} Dec 02 21:42:28 crc kubenswrapper[4807]: I1202 
21:42:28.731221 4807 scope.go:117] "RemoveContainer" containerID="1736d1de2fb38756e680e1769c1a83262f8a4f00f964564a08912265ec907d96"
Dec 02 21:44:28 crc kubenswrapper[4807]: I1202 21:44:28.293272 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 21:44:28 crc kubenswrapper[4807]: I1202 21:44:28.293901 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 21:44:58 crc kubenswrapper[4807]: I1202 21:44:58.293149 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 21:44:58 crc kubenswrapper[4807]: I1202 21:44:58.293828 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.193190 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"]
Dec 02 21:45:00 crc kubenswrapper[4807]: E1202 21:45:00.194482 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" containerName="copy"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.194563 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" containerName="copy"
Dec 02 21:45:00 crc kubenswrapper[4807]: E1202 21:45:00.194625 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" containerName="registry-server"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.194687 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" containerName="registry-server"
Dec 02 21:45:00 crc kubenswrapper[4807]: E1202 21:45:00.194777 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" containerName="gather"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.194843 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" containerName="gather"
Dec 02 21:45:00 crc kubenswrapper[4807]: E1202 21:45:00.194909 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" containerName="extract-utilities"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.194976 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" containerName="extract-utilities"
Dec 02 21:45:00 crc kubenswrapper[4807]: E1202 21:45:00.195045 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" containerName="extract-content"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.195102 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" containerName="extract-content"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.195355 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" containerName="copy"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.195428 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c806d6b-600e-4ee3-8e8b-cbabb5ddb165" containerName="gather"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.195509 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd06548-f28f-4590-9e72-84e83dd0a4e8" containerName="registry-server"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.198455 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.200242 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.201235 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.208130 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"]
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.268300 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-secret-volume\") pod \"collect-profiles-29411865-7q5j8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.268616 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-config-volume\") pod \"collect-profiles-29411865-7q5j8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.268792 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m5cc\" (UniqueName: \"kubernetes.io/projected/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-kube-api-access-2m5cc\") pod \"collect-profiles-29411865-7q5j8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.372578 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m5cc\" (UniqueName: \"kubernetes.io/projected/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-kube-api-access-2m5cc\") pod \"collect-profiles-29411865-7q5j8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.372727 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-secret-volume\") pod \"collect-profiles-29411865-7q5j8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.372815 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-config-volume\") pod \"collect-profiles-29411865-7q5j8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.373814 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-config-volume\") pod \"collect-profiles-29411865-7q5j8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.379137 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-secret-volume\") pod \"collect-profiles-29411865-7q5j8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.388868 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m5cc\" (UniqueName: \"kubernetes.io/projected/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-kube-api-access-2m5cc\") pod \"collect-profiles-29411865-7q5j8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:00 crc kubenswrapper[4807]: I1202 21:45:00.542431 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:01 crc kubenswrapper[4807]: I1202 21:45:01.033141 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"]
Dec 02 21:45:01 crc kubenswrapper[4807]: I1202 21:45:01.698220 4807 generic.go:334] "Generic (PLEG): container finished" podID="13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8" containerID="d53f50e70adba979f996895da355b35b8c6aaa71420ce81565be19dcac75600b" exitCode=0
Dec 02 21:45:01 crc kubenswrapper[4807]: I1202 21:45:01.698440 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8" event={"ID":"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8","Type":"ContainerDied","Data":"d53f50e70adba979f996895da355b35b8c6aaa71420ce81565be19dcac75600b"}
Dec 02 21:45:01 crc kubenswrapper[4807]: I1202 21:45:01.698519 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8" event={"ID":"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8","Type":"ContainerStarted","Data":"1ccc885b893d87abf6916d32c1de02ee530f53152ed109931b580a5c8fa432e0"}
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.088415 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.237464 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-secret-volume\") pod \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") "
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.237826 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m5cc\" (UniqueName: \"kubernetes.io/projected/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-kube-api-access-2m5cc\") pod \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") "
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.237879 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-config-volume\") pod \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\" (UID: \"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8\") "
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.239343 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-config-volume" (OuterVolumeSpecName: "config-volume") pod "13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8" (UID: "13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.244675 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8" (UID: "13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.250898 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-kube-api-access-2m5cc" (OuterVolumeSpecName: "kube-api-access-2m5cc") pod "13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8" (UID: "13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8"). InnerVolumeSpecName "kube-api-access-2m5cc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.340275 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m5cc\" (UniqueName: \"kubernetes.io/projected/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-kube-api-access-2m5cc\") on node \"crc\" DevicePath \"\""
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.340313 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-config-volume\") on node \"crc\" DevicePath \"\""
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.340324 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.723152 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8" event={"ID":"13fdc39c-fe0b-41a8-8286-55e0fa5eb4d8","Type":"ContainerDied","Data":"1ccc885b893d87abf6916d32c1de02ee530f53152ed109931b580a5c8fa432e0"}
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.723388 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ccc885b893d87abf6916d32c1de02ee530f53152ed109931b580a5c8fa432e0"
Dec 02 21:45:03 crc kubenswrapper[4807]: I1202 21:45:03.723220 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411865-7q5j8"
Dec 02 21:45:03 crc kubenswrapper[4807]: E1202 21:45:03.962671 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13fdc39c_fe0b_41a8_8286_55e0fa5eb4d8.slice/crio-1ccc885b893d87abf6916d32c1de02ee530f53152ed109931b580a5c8fa432e0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13fdc39c_fe0b_41a8_8286_55e0fa5eb4d8.slice\": RecentStats: unable to find data in memory cache]"
Dec 02 21:45:04 crc kubenswrapper[4807]: I1202 21:45:04.176477 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh"]
Dec 02 21:45:04 crc kubenswrapper[4807]: I1202 21:45:04.188937 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411820-f5psh"]
Dec 02 21:45:04 crc kubenswrapper[4807]: I1202 21:45:04.985474 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d0d1660-0666-4134-a153-421751d4eff4" path="/var/lib/kubelet/pods/6d0d1660-0666-4134-a153-421751d4eff4/volumes"
Dec 02 21:45:14 crc kubenswrapper[4807]: I1202 21:45:14.649094 4807 scope.go:117] "RemoveContainer" containerID="fce359c006c6e47702482848da026e5446512965b6e4560380ededb49cc4917a"
Dec 02 21:45:14 crc kubenswrapper[4807]: I1202 21:45:14.690503 4807 scope.go:117] "RemoveContainer" containerID="08b0608ecda55dc96ff4a300ea792f24825d946730ee9eddd950f584478578c8"
Dec 02 21:45:14 crc kubenswrapper[4807]: I1202 21:45:14.805706 4807 scope.go:117] "RemoveContainer" containerID="a9cc8e050ab13cc58b78bba0d641d427489401b569de779264b8b3bb0e76f37a"
Dec 02 21:45:14 crc kubenswrapper[4807]: I1202 21:45:14.835804 4807 scope.go:117] "RemoveContainer" containerID="045ef251a43d4058dfb494113960cc3de5b56ebefebeb0deaab0f5674305652f"
Dec 02 21:45:28 crc kubenswrapper[4807]: I1202 21:45:28.326004 4807 patch_prober.go:28] interesting pod/machine-config-daemon-wb7h5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 21:45:28 crc kubenswrapper[4807]: I1202 21:45:28.326814 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 21:45:28 crc kubenswrapper[4807]: I1202 21:45:28.327255 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5"
Dec 02 21:45:28 crc kubenswrapper[4807]: I1202 21:45:28.340358 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85d5e6ef4dc22acb7e295b39bad89145375495d24356a92707bcc342b9f67488"} pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 21:45:28 crc kubenswrapper[4807]: I1202 21:45:28.340747 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerName="machine-config-daemon" containerID="cri-o://85d5e6ef4dc22acb7e295b39bad89145375495d24356a92707bcc342b9f67488" gracePeriod=600
Dec 02 21:45:28 crc kubenswrapper[4807]: E1202 21:45:28.466239 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:45:29 crc kubenswrapper[4807]: I1202 21:45:29.025009 4807 generic.go:334] "Generic (PLEG): container finished" podID="4aed9271-ad06-407e-b805-80c5dfea98ce" containerID="85d5e6ef4dc22acb7e295b39bad89145375495d24356a92707bcc342b9f67488" exitCode=0
Dec 02 21:45:29 crc kubenswrapper[4807]: I1202 21:45:29.025062 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" event={"ID":"4aed9271-ad06-407e-b805-80c5dfea98ce","Type":"ContainerDied","Data":"85d5e6ef4dc22acb7e295b39bad89145375495d24356a92707bcc342b9f67488"}
Dec 02 21:45:29 crc kubenswrapper[4807]: I1202 21:45:29.025104 4807 scope.go:117] "RemoveContainer" containerID="7f87a6149111260dfeafef3b8b9a52a111cb2cb7329874a8edbc1ac25329e586"
Dec 02 21:45:29 crc kubenswrapper[4807]: I1202 21:45:29.025795 4807 scope.go:117] "RemoveContainer" containerID="85d5e6ef4dc22acb7e295b39bad89145375495d24356a92707bcc342b9f67488"
Dec 02 21:45:29 crc kubenswrapper[4807]: E1202 21:45:29.026057 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"
Dec 02 21:45:39 crc kubenswrapper[4807]: I1202 21:45:39.973224 4807 scope.go:117] "RemoveContainer" containerID="85d5e6ef4dc22acb7e295b39bad89145375495d24356a92707bcc342b9f67488"
Dec 02 21:45:39 crc kubenswrapper[4807]: E1202 21:45:39.974531 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-wb7h5_openshift-machine-config-operator(4aed9271-ad06-407e-b805-80c5dfea98ce)\"" pod="openshift-machine-config-operator/machine-config-daemon-wb7h5" podUID="4aed9271-ad06-407e-b805-80c5dfea98ce"